var/home/core/zuul-output/                      (directory, core:core)
var/home/core/zuul-output/logs/                 (directory, core:core)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log; binary contents not recoverable as text and omitted)
\J: ^z}hGZN6){`N s`+,bO mX\D'+sѷ[CV]wop%*h lwQGԱ;":T0xJW)C a'\(R*ݷhZL #DN?羽qSNǢ2yK=5\BVw t"%v"d2hPsq.[!*2Ku\tr(WHBNeHp,B(7ƿٓ*]^-~/Jom糸-]^{ivʈtfwo]ɡokxvr-h"/%o*AK ,ŨhB VpDƇ8yxEï\FVs41`(C 2R`3j Tw2m.$7KO,"yEiԘ))`_&Cj*k&Uf,ɧ{l_zV#v7WX`Eog~WW3z6/x"ML dH:VrYERl=J59iryue1"m0%ɍrޑ>F<4^yP {3{?0''/tPhIWhjZ?#>c{<;of}hFcQ==\rSIzqyYoݡ:}8䇿+Ӛaq>KN(:)Wl6ff[W{];tv+lw"_-Oﷵd,k i Gr Xߞ} v%@bCMxZ˱^jlː#w+zsHa;G%'y8=3rNo^c) 24?%y74ha _ᡉ5:~5Q-|=PakasXԔS`G:_1ݗzydFPj6kWwD8z+<_Wcdq6uk/NrmN#xŏWߚwTH.U+ޢ2̻/ئPћ (n4z!N \Gi > E軬)~t&8㒉svtb\#?Y]Mgώpg˿fD{EaȜDFgݽ5_M7]Ӻ 粃pr(RO'LO+T8׃s*E:wdz[h2GqV4oM#Њգ]otNo`T=3-Zˍu<>o'cKj2g8'fmלDd%@o#i#xHk9a7GڄIh>MƣBOo gI9`G]>dרjӫQ2&7"/8AQ&_S(viOkp%[iX"4H˯ yt]t.S7 ?F 54ZZeO;kmGowuZqYjUEtGA{ߕ jY̷틍m4Тb#I `5䐍`^dBԂ{!z/]b"YR(i6,Kӊq8<$gFH`7C4KL{楓 WNe/0c Dj sQD]SRjG_?zSv>ȒAw[ރp 1{C^#>MlJi]r&1&  Hg4Z:^9?OaHN ٲ*=8TBlN=ze񄘣2S!<2:6] ]DJR:BV'7|=zviIioSD( =\H@8̨]9 Oآi[=G߽{wxu_s"xGFI)+IVn2Эbjt޺Ke1꫋j pP8Qr+Vl(+C ,aȌ),V.spt\e2ļ8yQTֻ6 scPQKOB#%tK8 ALf^u٫|Mژ޲|!I+@\e¾ekU6BS_ứ Q;.G*dbIq@ q"@\S({Gמ]wt=ÊeltFq=SVD1q+G! ˓eɤPV җj ѓٮh\f12Ef)9̉L6,MHmЕj5oFxvq^vDjEE_o'=.Zwp ͜ӧ% Y1){דY%sg6'e-k}۶3+b?)@u3o]j7hڮwiyq nek-7ovža-qϿㅫƹgrސښ_65٢X]vzWsڡ,Yn+e\Rgv*:ex>+"2m/i{=EFX+,4ѰLYO541=ɖ}!"פ9X xLrNCWbf2)g26DEbP-_W e*YP6>y#GՐֈ[W=y!STrV)6s>蛳o'5'kl@qJ,C~BģBM{?!~BD2:CJoq%ib~TJ<0Aw)KUʳclU(.qBUVy8򐸊.rNgD؅T Yk8 26PpnB>JE\{;>KM~1wv'9qCWC  /xSATǫ$H~D'd)K Ij<Y3t"|G%jQ9 tAf %2!)!d \,5veŔDL0Q[c8Ƥ}FA[]@cN#&(@cĐjَ(=zaJ=:_;1 =ޫ /*U#!scFV+'1֌E/"ӽV܆PE$c4qۜ:ɨJ ZSR$pFYYͳRh8mdJDA'61,yn:d.tv+VoF G'F#u0mj2s8\.s%gKZ\F"PS_( zQt>R2meT΄3>EH@ O)J@uA8Y1ҡH*" dIT`H:1AIJ+J D{W(g]mOz`S!!yES䙄fV{L(wʪRX;Ģ]/mu<"4„㫒3=᫶EMvR+FZ3"so8tGHh& %a\TYr FcV) vcT=:A/sNNjҴMVo#%dPBAlhy\l: Ɨq |iAUHڿ]1x1x1x1u☢.,+5™+bB*FiꤣPB3tv s@)bU7dt-s9k߹ n~{=䋥gŶ=:z!tHKWTLXë)Ykk)Iz2K.cNg =9Q#@f\f24&.F0|NQސ{~nշ^/_z|]OwꋿF'N~K=_s=_s=_[,V,'W㵲V"w`G-Oh N#94i٘y\ގN+Yn>ػb2ۣZKi :' 2F:]nb$EJ=1^b ,zC \h:h 5ڗ6[_E뎥+'~{qWd|v™UJ{V26a/ë+G1X ޔr:a13NkSG \Ē&2&eAr݄Ln KS%J`XK4EA*).pLQ e.զC~Hkq7^yWSBtZl'Q\ W3o0=?s+I$mcb3NE2+V 12!IQ%\O3^t^5# NE'HFbs h,y8/0!lvF)pQ'wG{8~a^`4rbYhskb"KL ' 2q2|wͧjRXDՅТCfw5CC& PT۲e0Jx/htiaq5N-B’mmTI(,0Y PRf-`6W)jsG=CnZ\@ X346}?L"ԏ-@.x6E~ aGmF.Wgڻޥw]0is&޻z={L߬٤SrNk0EX}TSϏ'L h`:<}.\F{;>rrG%x8<Ë1>%`ZB`/C^x~{ t3<jN&ksZfOm6o鹏x0OC0gz1J}݋]ݛ]Wr'| gU4ISU]Zwmj$k#w̋t[tT"?_C }HGig@`xQcC-sCZpz:óMs9Φ{ӒcaNxvS cHDpf/Io*{2dVg̢芉*{sBi tTN;P<.e٣ R!DD"~\LHu u^ف142Tmov(vtcE-YnK6FG8 Q S;j4$!T]RńH6NnîX}&7S+̐¬\R[Y)5O 㥉LZ}b=%0#=(tV !i4&*55 ol@L"dRϗ9Zd]fP9n54+N:Y-Kot4|W~u֩B#aMĔNu<7!xVv[%Ah*:}Q阨ƅY|qp6m*6 Rd &ߙt% I##EH7hT7g[!c3hAj mӼGE,&Ff E$1?K&ڻe/L+2g55:B@(C #rB,+TfDҴPF8fh}]`M[Edtz7JeN!4 歴H  FzI.;\>Sf?k~0sAKTj,%R(D_|SK +f||ݯo_|ɏ\rDžVgi}fIx2`Tӓ/ѳ. 2 ;JaύԷ_XrXcɻĬRFsݕ"ڭYyJIshy:t1Q }iV?5 t_Y~''KkZ/ZwQ_{$MҧS'>F49}TU{{YZ?%?+ /N'_\$f1{q.̱~C۶žH5~;4ɱ^7n? #cJj\YʫK:^U _Yeze1-)mLaOԂ٬e{k*g]Y*gח:RWh楎~HsJ߆P^}:N0W6aa?/,TK?ۗ*N>p /_~o޽>|C<|8PRЭA[V97M+ӯ/Uo[Y /KxJRLEG_7+6=sA~y=Y:EwZ0X\crv\>/7B _6 sYL$<68ܲ}q}&F;8>q'(@̚3.2}PQ֨ldnMd.ng#b \F{mö[2A{ RVNZO9Etf6t]~] ct_6;;[:*]vU;qJ m犼9MJ ֓I?wc3OFbm:ÝL;+v^rMuZ3]6I7t'&I]#0%)h,A4]DkK]4`FE+=hbE}r2sC-ENtAU_>A5l?mU ѳgU]͟ Rdj Ekbωכ,Y:*.܈Wu=Kׇl>XQ+hTTD;Ta?T?~UOzHJ[ŏN%sB# 0(> :18ʁxUUz <%ZZ!:2L,$m,Gh -25ZYY+GV'[='CF3Jbmazf[dyk;@TZMh(0EzTӈϛɵj_~0u@YGͮV-"3`܂PǛyڞިʙ0WhȞz2~/) gqKRvOjM$ն}'4܁%J\@hk!Ynwӌ 3 %MaE1[1eYYJCr`*;v3Al uJ.%%37x6yo1,F.Y\u'ҙ8k?] *ϖ|=߾=IH ;"g keKdf f`qt5*jB-/Y@dFg SL =)B ckLXQ;fEMGŌGdEQ FgumWg+LFgͶ /6+b]h1|o7YhRφؗBݖeT ˔ R;5qI rW`%c :eK *)!fyt)($L 52v&W ;b  <<0Ѯۋͮ\g0W7~{C~o<[e,TJD > 9C$ms֪=nRFjbș(Y5yMtPFmԙl;flhZC,hu8ҮĹpḼAFǦ6P{`!*1!ud)yFq#1ȭ EJ&A`GOaJ$ ! 
YA$MX 'n"J|2@Fuus36F,NmAǦ(;FDC"^{d[lD^ rP Ls?Yo3HRqCp)"Z)d'lVDs}0TRM|G- cDLˇD] dz4۴(;>*wxm11JuNrsɄ sepw[c9w5:6C1n mf\+o?t$E㳽Ya%Xn2G/_}G9K xjR\Y#i W؜(g%rۜ7JhwGN$r5ĒMl4\G/\k2(yH'b!朣@6BtX!ܫ`]TP0BrL&!PszfPINK­pg⬱?$w2ٖfDPOEfٹly.y+;zRnW%۩c%k I(O#eL;A\ JFbQxٝmR} tI'%^+lkŠr- ݅Z+?},,Q$aԘVOʸ;}\ RЭo]zYg !X J&9ɘQGD<:]Y#YY!0܂Z Q $H+[4Ekg<0<""=kWpU|,pUpUDWV3s>@ã@Nh\%]l~Vt>O|B;'t>O|-qRJ@D#Y}K@?0Q&E[d=J_ F$Ʉ䐥`L8$SF5 >kE~(˧t6̼ C~:nۅ?T>0d foWj;^e*S a % YiL9x Q {u*թP'TB u*#bmY2l2r '4*B*Ɲ%QdD2r]M]5D>!RN)]29-rߥ}{#g?:^2.LyDZFR#>{,vyt-bmwizeᔪsNv[P㽇بw~,Okl?Ug /m[5mmml׷퀴v!-}D+Zt5os/778A@g-@nN/nVE7 'pKeBmv`YzfLݤuxY*Q]o5~4X9%p+ GcYr\KMxRaO"[yFm-QkߨӨt'V{G?j<{ps+<\K>_fT-T$ *oQ豁h\c  Hm2OG ^aƎGD.)Z遂M1C< QHb譪g:v96r&Pgnl$ ҵϠT ).hPH(zMңY=F΁nS o5~=#ug%~~s;$aڪSM_.l ti?!P$$RAi'}݁~QN "0kN{Lĸ)ɧ,)mQF=$|(Sd# Uɢ _wP (]Ͼ[2TݦX+OyGdHӗڑHߧyz9]7-'['JF=Av- I{!_'gJ(f"gVQ1ʜo)n>nJ J>&Qo`Q`n}iZ䛇/M:F]^L^WVWA 2: 3Ĥk&K*IdT>|LF8F|zQ @Ct5T l.QVYA-+_r),F,KEی@3xd@&Q{VDrKa֊uGs@XK4hs].#|y@Z-W\`Ǡ;3ߦ1~V< N,A2%Ed +() uD.dЛG\/qP=g}pEY0^zJW߫Xdۻ=w#8K&aW]=.⎞ѣޣ~Is0~T(L8a7_>V#\ݴ9tq'5|<噼9*yV/X_]#`~{ 'vI炠zMNc 6oD)a僧cWtswtȥW8:WpX[fK'^Dכukݐ/ݹ㮎طvdBXvT&.>_u̧d#n;I QPr`, MݑqV3au6+\QaѦ#Ӵ113OG&aim^}Akf;7Fb:e#w&c:wOge9lgmdį;^2/lx20+}x s="˙[me|q{9$m8EK 5AHmI+N3gJ 0"ᵮt@Ԃl#j[Aݏzvb7qqTGsջ.D2! R&5`2~9dBFtܺ((ʁ5IU7Y*e[9 O^W}1ɏ mDFxkz@]Z@4W H-uET PNM? .yi- w<Wh*9/JIiz#*ĠEw^Y…u_'15Yu(ƤZ"@a`ŠJdd`BsQ)9Q:K7r[ 2TPX͵.YSI6 7 p^l2[/ >FkEs%cU=Y$~)TV}yjTvHvcB͂ŞEQXkfSf󲴸BqٜJáNVIUaT,7IhI[,ڦgjmaOC}M YdcŗjA ],/rɰ֦' ׅWW';xǛκ˃br=:(XMyDiEP9T -!&KIV#o ^<1xX>m3d`[еwR:,dٌb p`g'ASe(f "Ii}CL$ A{DxA0 lR: Sځ߄g_yښk2. 2L 1c E0Hs2QPĐ(iϭd𨴗r~9 3UVb yd-u+/؋qӛܛYWvd=>7^]'*CkH@ 3MJb hrd@g0J>ȷ/ hesHlXמp8|vorVX7b e-+T)hK*coPV?49=O:#bI}w_JBHb0 E=C, 1!DKnTZ+X.jJ dj()r'L-KRz])RcQ١h彳ggqC'=@C3~_~Zo!^\?w>mQ~z׾}g7|] h҄ uݴEw6Xw:DIL mnԂ;R J-!}WENrA ]є`EQr%)6 ɷmplpIO#Js 1zHș 4e(1xF07^eB&j/H#QSVHLT]>ȑMXkS-{`ismc|wMPX5A\րe+S$ad%r:GkK9U KiZTo?JD39ԫM>ŚE컻r=伯ǓXR_p5ufo^TZ7ʛ%-lw+/LH*(c,ڲ[,chY ( fm=lɹ(j8RJ2BJGȁ+lQAJjm2*la3ƶP pJV%)؇7>x~7tx< *HJc* P%%ZFc{JfT.Bj=fSjk|v )Hֵ׊;L%XVұV= ػ$>"i2VwHbC 82xuP*"G2 ]I mlHiBfd(WIe[I$.TO2sY$^T =lFf{/+a<L>vEԃE,g u(>i.{APlR*yF N qHKs4I!E^F%}1VlJ31 5ӈЪ`%"6#gE<{EvqwHfR]l|BL.vεE af^ )%#8om-LC*Ȥ$@ >[IǮ6;ۃ lv@[Uj=mO"O" n~|GzQ~TM4WJyw'ﯞ,*[z#]|Zx*d^N'YSp~Qb.b܏J}pY/zʌPLYe)WY~}fߓ'>O3!A!H7|6 =j{bW vq=Uoέ{=W=-O]dha;d0yы-_.jWjG{-?{:bi}89H};=y?5ϡ<=Xx\7=$&փ|dYeog;?>[o/<~=+,wqF>ۯSEh.0bބ 8%tHt?,BTT(k<^ e7&Yٚ=yr4iJ6HQvGmAKpw]|:NA \8i/5g*Slb"{kAn5c|TZžZx]\l獚z%Ec،!EV^Hlz^zC;S=ى/H\{j] ɋhmv^FΈK)Y!@=T9$H* $ 5#z%9'L@쒲YEcc5S "AS/~ ZU=4l?>W Si~]l QuHAuQA`/:5Sҁ▻ tP߬}!BrG %F'l"S aKQ@HZ(& ,!#iP@Z,ʤRYf>[xuZьFrRPJ^5 4#g3b8=yPiZÖ[:.wV|t7y.[iFQ& 5B_qN^IȘ(g4(o=Goj 5U .߲a}3F^AW*߂o#o6W͍PuEЖB-t؉xG.QGJd0.(Dۡ\1c=fF xS]эbj9kY()ej#Gr ,zIc`?u-flQm!E}5q9>)=c*`FI|)$!Zx!h]F@#D *b ࠹4wigȂcwI`,ZB0@$ƥN\xc 9pS(AYP`,LV@A`MzLJPC:c")jgRa_;/JDa*kHڊlZ"zfT$8SzP*ssC~dgFYHDC,;Q ڃYh+9-sRHj5Zi~|Hu!`'PRb&"+ %BŐ1KQz;]ic8f4?}{(UĎUD*pf_V09hƟr sF bD9 փyq]Di;R_}0-ȡ)f*GE+yMBL\ d]((gP=V`젚15]qA:Ogձp t˒})='˄!䑒M3-~y:"&[OG #E٠ tqy:5:MlaȦ7>%O/em[ ַ!h!RDMHd&Ҷ&C2 yBYT$}63(m&6[]7HCVr;q}Co\OSoR,1w!3qMyLܕ]N]tHȣB7thS"H뎍ͻ߳!EtJa1R.]=JNhdtd>i;l:l{#V7WI蟷 [_/,uiྲྀgn(:.u%L$-Pʬ_^3{Hڨ4^d46 dRj h뒣n:i=)rCΫͺpB'UPVL 9K.(;$"bbه >ه>ه+吰Q+$IXJ>b3 @/2I@Yc]D,ld9W˴F?`M?4x3r6[C >|nƒvSѬO4},M;<6jCnmUk0 0u%.jhOơpb)+YqMHm,cL_ߎf,֔#_iKmX4PS/?>L5cZScԺ}7JڽyuS:C ݛ7wG Dx!B5`!nBmJv6VmSIhi`uMŜ3z9/ٻ6r$4%C.3`0 >mQNW,jY[w0j,Y#U4jO%OҸ{lߵ{cAHCGCJ9ō8`1waؒ԰ƽuH ÌFglO[#:ɠ;_\6;KwIyC~Ӻn |1s?M]wJӋԫD]S,@.B(~?oyFazd{""ܷ/ȲD[r5嗳כlduZZԽ'ZpK=U#BXYLL #@[0b<@\Hoޖlፎ;Q>hˈS4TT;GgYbX^l_NSק~M TGg]?J šS$ojLO-n+Xt "P $4$4R{"=!^Ϝ8:gQ#:K144l7Wne,,"%1,( 
J3KЈ`@QLt2imst1霦Uַt־*Ţ"FGR_3*B6Y2C<ٴ:_*UK]yX9ϪѤzL?9>~w?~8;׃68zm.Ok(i-q9~~n.4ËtfrkO+̐N3Zg ࢺE3*2WW:,Oy3oHU^4Ul;u= 6hW._ﵫEɡl]҄()@ ;Ҏʓ ܗ wϒE[q$[`DkfYYvAyi (1D2e%P4)SMϑ{1.GԸ1G]hihr:iu2ed:55A*vI3Nt*bgs;ɮ΋ujg%jSj?Ƙc|v9"ITy$"kY"M5#=rstБ>Ado_/,ARCA*AC @a -}MN/LY`Dq|ID3`0uL!x9no.xySr,F`łȩX3J!n bER{%HJK'NNvtq`A5lt] egOٜʬC*1=׳n/u A舙 D%Vi(q(E03CVh[ti#1E 62(#*$;A (UttYǤx7=MvxbU.Z+l%\+zfyfIa__|GJmOK#1MaCF(m8< »lʿw~B\17%Rk+yԑ9)\O.M a!D Fy\č~Gv}*{bSn=. X_A{^ N#LoHy 5%#Mq7{i.Ӹ7 a8aPQfFpɠ ;!S S`k$,0.SðЃg84L @l%PJ%WSqlu/-ds D "0G不@V'j 3|ozZMOoy7C[Ś;M&F/x R!*j)zӉIpݸqQN9 YeR*xfasJ#dyHio҆7=P 0} 2^S_'"Oq`Uhbzm04)``E`IaXMjxUOoV;HW(YcXU?J՛w&Sc|:?P,ΦY1:-窴HeY(ٝЍSa)̊aZytQa4ɛ*rn遨,nՃrnOK2- j$"=:~i{9$fOjoFb6@UO6]TFZ՛ջbTڪB,*oWEK3Wp _J߹q[E5-ái]W\ϛ: @EP=~N1rL[ӗaޞ:_b?&7<\ߖUj2w3{-S|:۳5[v}]#QMH=Z;yzLu@u=&܎&MK=>`uʠLyr*iDYfMC E,zt}D];"ϼ i`ڲ@`H"hB ŜtR6R8aNܚlPGÿ?O:8Ro&p\f]*tTqo3r<0ˠt$Xֻvc t4?]͍p֗; %q۪|'yM @&ÅI1,_H  ; -&gRki{}>~ՑƯB 4g YPJEB0ɟ]j"-4(HQQir 8!ȹQjEV)ݝR.k%OƮmgǾ\0oMg3FtP{^N_xSH*M}`W^o9\Lg7Oɽ x,LAN 1$`DPXґ ,I^YO-#^ǎIf%cE~߷S2@[7p[;/ezf`рmP9  wzJn`?O۷ܥ'm>g{٦NhB2b BG9D9px{tw 6q6=.Wzn:/+2/ۛ 3A[0 @c̽SF;c*K:ۉtxޔr绱GwAwRw=9B)7~QLǡa6)$W2:?c`bҹVpk2<贻ǯS(mlubs;D?.;X@$*8ʝ7\MB"eF@(Ry+Ioj$+UW=)g!eyK+oC6Ug_$Cd~5 ~}mz7O_&-Jb[θB\~1E[ASrk T{5$`;Vtl3=VRR2sÌ7`SAb$9&JY3nb)dW[ӿfWо?12+%af "e\2^,dlI)' ::Nk$nMi˙^dF%xГh.0Aȳ 1qVj:/uuPݡdYjLO=Rtsw=.I:[!trT{GL[?5׏mzSJYQYarA8Yj2z&5Y!@5Jl@Ȩ1fYYVR F꘭ѥhrFST2Tؘ8ki4iƶX(Bb3rxn 3^=X/ è7\* >* dلRV0'bhև@$yWT}Z,b_HP=9/)(Fmԙ|;flhZC,htY~Џs.vSڱ-jcè-j vq$HĄ.k%ō x#"J')R4aJ$! yA$MX 'n"HE8,Ss acکƜ`<DlL?ED0"[D\\lD^ rP Ls?yo3H2qCpƘfOX)'lVDs}0TRuOG- %aDlL5/ '\NZgcZ-.qQRHcb49(yzb2GjK]#}\2 {t%lqqx0sSڱ-ps;Eƅ 9rjQlqqEF\Gdϥ!M&vrYBC18eMR6qGaVxU#D6jŭJh#@`vI&xY0Q rT'؆96=hY~^Z?p4ѓYxG7\*p.S+vtzi'$ p%է봯W:|ao>o{=!Xsw.ׇ`WWu:V`<|ŃKWש vw(ndnq1u x>מ[7(8A${4%K"<AEʶDY"H DGWE`GWE\W$8t*RrՏW^͆~ow0Uj_~y5&k&9~Cb;LIqD?txoh|k$3gҞEKϾk+apN Jq=&i%ࡣtr(㠴F\\<*)[Ƞ ypLlhĵ:\)i+ШG Z,c~ ;׳r. |m2Oj7U=OFR~3U{AF`JS*ү2c%Ѩ|t9,̬"e\ȭ -dYkTK3Ol1oV֘#,hf"2 iEJY໙p͡& oj#-D\`l7I/ҺR\p-\m;H4i0OZ_~3. wݭy|PetF70 xQ^QP;M%Vkhgb5 8Ӣ6K_a7x njCiebk X7uLX][gxٽ&~z0hw PKjBByE HClV=lBݲ}| l _=a(vUhDؼD?-2|̌lR~bĻi1+EpU 5crYD@ *'B&2q.3qE`H#{: OzIe{8=cɤ2|L*]v-_,G%[tty3OSO@uNgRz=M4nxhO mFY-J~,E]G~*/%bUqWZ>DJՔnNZ5W6:xثR-^ Sg+vd'V,^@ljv:`rBJ9VsVE͓xi7j 9Z٘8k.R m3>$PєB=?W\1czy#O7$G^tya.&raZCj֐xLz6JeNHv/"1pC4X6ts.9rE|Yk㿞u~Nο?e ^Y?S┺ξ5<]PB˞XO6pVVINK3gw\~Lɲs? xE?^%`CTR?/ezXo˪~80\sNp|d8":I=[MM5%Gfr\`,yJmgIq`7QuÇIERԣ)N" ~^εAMWtJ Ye $u59}W |j7-͇{5B;irwg|{ _%R_xy;|j=J0WK{q {߷s#)|E7^-jI-|EX63,ohcj|,XfAw7}ݫhYaV/ՅjUVe1Mm4,I}>c)o3ȝpxC=z{U/wB/G޼z˛ۗ_>|p4z`M 'DET]+_^0iIz)J2CN~'y G,oG M5MMiѷ+/h_k톫XC,>O]?QZjIR!GHp'(@̚3.2}PQ֨ldn-0BGHϽ0?q"qrOY2A{ RVNZO9Etf6mtTVkL'4Է|r&G~% ZF'P ۙJ>uq7ܟ aRBĀ溗QKW'^`ZbN&Kk:IY|m ڦ["p)w @`)E8'Z3QĀ!9oB~R«wi,tb >u~5|Wf1lym_ꋆ~<6/E%gaġ? ;#g3bZ@FkTR=Q*JpA395[:9I7@D;ɏ\ncN$d[ɷS?r޳x)*#vT}м߬r4[/ cįf  ,_B69cJD4ɩoޅ\OJDv{j_kHx=޷܃,^=u;/{sŭYhh۱ee+'fŁwE B$g d*\0"DCPAK 1Fag7,j`[x; ղ4$&.PB{i5v3Ia@"ZM8 <Ah*cཊs&)$|V-$fgp3-,y7DF L:aA["1dlj)t݅^"G[z{FzV3Ș5JcQ2TɥRʈxgƬ 3mJԸZ,uGġUBmD~tc l/5ŀ'7=~]pF9'rȖk$ 1 ¦%L xٻFr]kగ rfzx$$ijX-[X%[t0qԥjv"䏜$(E++5p5K4!hJBfY `kYJE2`6K 6u4o/p6I 4:FGzz7?v `` 0,nzov/obZ5qFyb>8txq'mSΛKez~JL>x5;hvv:;~k[ȹhsr$'4Ng'{գN/']=A@7tv5%I~ 됋6 T;73au=8ݣtѸ=fziwl3[ ]X9n{VeՔDΤUZu=j$c;_^5ymМ'*PTg<~tS )@tߤIKqsw~^7av5{?Ir|v2>M,̧̌LB-w{mg=fO`:~I&87+mƤ턷XNM쬝j͍.rvNŒ]1""FDO`!y@50xS;\tJ7TZDZ:فeחSuTW_9p͂Q^)"- |-I>b$Fz@aH? 
|g-/Jkseu I^Ө"u@~ d_ƳvC {yѼSgBݱdJdY ` M[r&X5$ġxEdt ƁԏBpȴ)|v"'@2Z$ZcE($8YպJq!t& &2p!aEF6fhdmV#guK_M1 &D?k2=Om~zo MbPsc01ĔZwZ|2pm߻wqײX\dNPN>M3?=ou¨%-'tjbڏו~ͨei.2~ >6#4hQ6= :R͝o=ռ.eqb/|=YwmSO9w5ia&]L'l[mO+JіUd4'+>/ >*rlA& \R=dϭk|}z:}t-ܞNvsV{;:_T5Bѱ`LNdjS9I LDRVe%Ef$1::7NJ.;LHD#kC"ggdϮ^MGZ]>)LPA/1ϛKʊHS7$p$)ͥU6,GcQ@'^:RJQPv"+cC6XZ,n-D,`bd}v*qSڤ>%;UjK.c4=C:k8iс8e@%ВΠ<#)ِd!ddYJ^}zcm]7y?4֟)zo^omɽ6އ @ $XmeL JCbc?F >}`&w.r6, Ƶ)&;$.Kdx UrsI'N'3ík|Shf΋V)\pP2jAR7\KftO 4oC]?M뀬HG_t7_2ec [V&طl0ZW,毝iY#bL[ГܣG8j|QP|sT0Fg?]fZVn2zF&o$59ȵO?~PkvѳqT y䢥L3I6#V9׮7j7{tޟIA9ntm쇮'.]m]n 0vmk$huCnͨGjM[7&:<=/nبE{3g}C7}HF%wDc $NGčH^.߻oFcigs\b5*5VA4է2*#Xyr&, Yv<&..cMmV]V2.w#U}}Wz`+e :wepȑoɅ2YpGJHA*:7/EP 8:5>ќ #mht۴ȭ}j뻽yv<8^Q\hTm%= [Mn_Dѓ-[`gx|M>T9t߻,XAfܺ^>5K/XeC/۲+ ?xJj?Oi[챎f{x?Ù? #eh<0pՁn0jRۃ;ZXVVeiSVʊްhܙLJ(IEK@W7 zDb eHч8rYJD'CީM jlVeRq_݀ӧ/ q%u.Udȡ 2g< N`nhU'^dQe+wo <"#A` V#gi7S2KqZh}yqu(^{ӫ*U/ozhjygSnD&ngjf ^:/rU9'ÀJZ;+@;QzRYzo!&|KN&`ZhIjWՌښRdKIN`* "yL6gZzLJfFljg ee](]bH.a5YV/~7瓏gk:K`ƨ1ke)b d$/Yu%!'㜎9p[Wc8{> &K-QԦ&pduYCp:G,[95vӒy0Zw쪵ֆAkgu"ÄĝC>X-'p2)/ydX"g>JUC&ːyQ!]E-Pبp$EJ,S9a/Rdb<X?vՈFTF4:g`69 H Ę' &YE"8YPC􎻠ui]ֆd@gB j-,^p%FFu!mu 8[4:Xg5.U/zQzqЋk}#qrTN 3&ϖLŤQr!B sCSY#d!=#7F_k ݥQI[$ ُϔ(j_ُkG}g{\h;x3 $WS_"d$:%>/fMP7h>{{qԣgLz$wz3Qb9l@Z:WZ-T㲡||iGWǣ& ez4W ;ϋL%zV eG`};z;VNCǹ}zTFzl:ӫbw˒/ÒVw䮖"Eh}&|J7?ǦF/!Kp HH ߼1v̡y `UDNrJ08,,OZ$%tbhiGhvkh\=(ObLcv)ވ$ŶqFsዏ~2[;;&2o% wC>tOO9mzYX4]|3x=3/\d|nyߪxorj״f/nf"<6:4CQh6˜Zb6j"Q!Q |Ȃ,k.r&kˣV) D2PYV5S+'=v ,1:rt(%o˂E&fo/6$$h%e+u R'v]aA~ LBٟJ$$$5 `DSVE1ubzrMi"O}0XЄo bE9s(2Og}m*X$W'`,K#DaTQ<SCvx`|AY9X?_jLCvu bٻ6eWzI둺w~pl$@rEzSBR[=% ECAlZV-V@"-Nڀg X4TTtQEJ*)K!G ./9w[:"6p^ 56 E[ţl=t2Tܕ3ɘ;oVXhn1 EAz<5!eZS wA\B-MEΘ ,xGU`vM7k Ly*7VLCTI+նRN*=e |lglXvsl#d!>it:S,?HoO%*8*M2@$R2\ɭB*cSj6vUtڕK4k͡,h&gs:Z;huP&L(EV^'\.ٵ}b3!V>X'҂xmGҏާ^s)i{NWȑ 9?Ag-ZhͶEZτCuFÆAɻ=qD:{V5CE\鬃$IO7-br>v>'s>9jQIq FRp)o] ^'B.l0 HyIBlD&$Q S?(Ktwަyt[ is!x9DμZ oj_=M>}P+XTkmyy9žSܔ`x,9 1$qGeWewvTؓw;+'ۋžA ,YB uw:|m7,@Uia@7hd r'ZZn%jdvM9X/SQO]^Mxnw/|:*q强~S` _-mDdf[1-^}&*sy@5`AԸRq(52ȷ}L%WmoƝȠB1ȡL 2[TW1 K X6f&\N,O hB"*/ ,8씰yŠ?_p6>=0@ c Sr-al~|=t݋23kA?oNJn IT--IkJ&rR̦R;o>BjHcXfr<]2 };H-54s+lV% L+;f/+ry)WÅMo_Z>Vr3eN2PR^ڈKJ@' ]D'_w`PKB7lՅ+XIJiA=L fW:#8*_EO`KSO9AKj)=2 *!xLBn"/6S\9YWIk޺4>S#4NכG ssYxF9(p58@U_]o[ `w>9b(DWJwQ\70T0 Ĩͨz7jj֧NG˹9GwU ~UIy=Ë,?Z)ee?a1/EOc+G! 6Z"9!s3k{T%%tw>jFvV<Y: jh"FQBbўsH'2#%*1h0(' HF(Mm]1Z{`:JRRNy|O i28w#AI}}~YI$xecW3d{ .>'0GUGN6+sgB~z+hSM_4_ՍzkW&\zR/8>L**T=.i궆}>ZF>` "k%;FyR@r^Zro--784Q@0lk#=a\JzG6zix,˝=b]u^e1jeq0"Y)g "ƃWִg: jK z1X3 w0q I:{rIɼ84M%JDN6ْ E-Qgd0{yUEgǝTh. f 5\s0k YLDFOv4qlaq5L;j%^!}3KJ{=_Vjs]țA)+c&Q" U\dJ3!Dɉ zb:8IQ 52J1ȸo5ɡJ/EJ؆%1r{q>1L(Յ̾;X{OZ}YcST6 ,WYbw%/Rmu:viGnDNEѮUVAAvLA;0iB_;%IUc5>3j-!fxIݼ ڔ t@T>xj4nx~Х\w F$B0Ԫ`` ےVEH9ia(Q$ֵa'![*gP :rƭRLd> z)KCQ^ # B*|frV =\xOMP6}z7CDbJVbc&%Lk(o `S+Cmet<(ƌ6AZ`*WyQԷ)<(h >Vc~; Hzq" kusҋ<]:[/aP]ha(|R柲wF2qLdG2I(LUJ $J IA! l֑*X˔d)oyl/FD"1I,!>QEbM@P҅Mo~F79Ic1A5DTȖ" QA@YkP=%)c:>vN>vG~+ؾG԰} zܾ0h<>i~n1nz^i! #װaJIy̓eް^MPLRI0 Yg1R0PIȘ _}ǫMw<'Շ,?/H_vMt0Z;Ʒqic$opm1rgOoQ7vh"@I`|woU)hHB E}WQǍk6:é2K{B4CZ"/WwlzKGJڤ$}e=  c3}ѕL<}NOݺDGHӺ 䋇4qu|5!S_1o$廘G?6L̪:AۏP^؋V1m7zj}*B5ݳ۔>{4WxsSقӶvXGw"G_ח"tIsu]N+%;m*QbۈxOC셬mT ںT;?CvgWxo'ے5P+uG8-DVqhԆQƨ="؍ϤI]  Uhb*+ 9M$|0Ma1օH2#C NǚL"YE e1?1EM5fl~sVDfqhD4##7Y2d!:ApHϻ7RB]\HFPC ^hmՁG)u{c2/IRL1!%۳c8TGf쐸QGes:lf%E8.1.qq Ŗ\PVZh)zJ팎2EmJLZ*#"Ƹxx6;VqhM~1܊c*\zԷ4M ~ўCuwnD{s;jk M[省kk:U˕c[?B(o'{ʍ-g!A;"wJ`!LdL 8Qe{-3jvOj>5"X(@ c6*@,M E)gxS@$l`KHZ rK颁:-֥aفc9Mr]XJoW;}[mr(nӈdD @Yd$CMDV[!RQIUU0zn3^|쒜B^ I-kE+ V&q)K*)@ݖ0SZY/$2+B۔T1#Ck#ingzܬ7&CϥsPW’6' 3?Q>gFh?>/q1~PM!Ģ$G"3F[I1"rNH;BfP+ߞO:-~;oR]/%%^9D"$®CŐb `R"; 3\āEԢ1N0(s0?AhAZxfCP K*o$R3=zq5z[u"?almf"|;@E“*28{: 29\ . 
fSPQF՞{sG0vV,Ph!ܟx}VnZyHyLg}lΐD͏[&$qɍS'?ZY.O._~\q>u\ݗ)C;`|jhƲ]H8(y *wv.[fI9Y']I+ FT$vS9KdRrfҶyCLbZ)dEGrV`9A*h7QL*DRgBe@:J-ІʆeGE Q`eEA[VG/nmhv㈎NH!u?$^MAW-ru5C ֱ-c\:%%'FddEDv03CzC8ڡ,Հc}n!4tHj1gxU61>E= ]<|=+3O'/<cr3>LMՇqzۏI|b1A֣}[͟>\/#:Ҳ}Yny^ٻ&v^Da˲-oq~ؖ%)O448ͤZdp<3^Mxy?h>cU`DN!)S&#j bĊ_{)Shb"ՐpI%\T):T` cե؈'It}Sm@)Q_C4N`ި<4 ivj70ZpvMʉ~.DЇp _oW_mMǏޏdjimBP*]cN韣7vl7jO Tiz4$,ʄդ&@DMիlXg_>ΡJ='jvm$Fn"1a<л(:J}L w ^I]ws?:f|z\ 㷗!;`_މ"G|-F'2iG vBL ɂ ŶK`ucqvuc41EX66[Sw-q.Ar4ع-H-=2ߙ'v>[ChhwyW8rJ )f%]m 34 L>H 9ވTu;" \b4TG͌W~mT@{LL[aL?ϪjhT~!gD!$ %ru.E|TQbIBD&@p6( bw R*V{ydI87&̱qek=f 5:2qV YKc   Ϩ玌w

]kl5 hBgƪ1BޢfȊ -j]]kՠQc gS5n16&9q.q}4"/q,|T=%*^@kfo L^]XE؃CɦhlA!ͣIBLR [h)lL0-^6}t(\Vi\sf%/5\XFd!͆\,xaJ1J l-olqV% f1 x8yu4Vr/d;B༃ }ܬI1F g+APT?1cU?ߛ{ Ov@#R\''S"SE=ͦK.e){D6'[k~޶ђ)ۅ~͠e;Dہ =#%0<%>]sQR:k箤g iQRTF{dOwG<džVH:)ASĚC2I1,WG♳~' 噳3♳TS ͦsYؒVrNOs@Mc̅Pn6O[u{6c2U5Mީ0fG`=o?t/ZG*=%)&;ѓ9~twr^1+./PÇwyMVau}gOo68Cl fGWOeJ}7ӿ?*y]։FhHm/ՄWg )2e*dDPME V|-*T+ڠa*H"vE"\R562d|nSd,󹺔D)NRPBSmzFz}h!ay~i" vj7[m[pv}ثMN/Ep>'>Xɴ _oW_mMǏFICWlH$lBAzsJ7sA]%Q%E% \ZdfpO,:I&3b4;yE?2K!.f [_SR[q*=j(dA#i5HG@^ns =8%?D{AL'5qo< ,tJ섮ݾC]0:=wAD*lt!V O% ?e[jF H%̜HfG3"RH׈R'ߡ#GCք0HBIqd&MeP1Ac!b5P\!AJ\KQI$ x, ᢳ:Z>gt:4sVח љ({ 3}8t^z(FzoHm!FMgFF@"Bw#u`Lͮpa;=}3ٻt9R'p(40Tť2Ku0VRlh0X6]! -d\zNE 4grrlt_}{n1 U? gKJ5 -P: T mOM"&։GV_R_ފoq7**~|1_VGEmxК:lNAz{9'%O:-~;n.R."joD&firǐbJŻ ²2&p盹  ͙›/e }GTp?=p69`^ٜJN|mj5Q+ѹ}^5Kw^ܜа5ҬI h5\[l]TLsq{lF6.@Ʀ9iBz` 7Ms뾰|;C6|'S~nD8/HXMNSl⍉Έ̳O8gnrB橳%NgByz㷱>}cTn:yVFL)M̤5O%GM*,ъъъњC`۸`ē&e5WMCk֘| eh&z4Op}3h۞{rWAt/\$y5'xVqIN̡3^NmN"VnW7#@R`QL &=?R8U"]Z~IoYΖoK6D;Y'v=Ik/ό,mR@Au1~Pٲkwz[a*޳4$Ww[823|i級~'Xml_1c 0t!ѴtddV32ȍ&Vd9ʸ*$GKV;)2ϰcOUSEM)f0YRX,@:yFn(^Ec : q^DP\̘`䲈 *3nwȓ3"1A0e:Cg2gޅlrbq;HsOF ]/`2iBx1fs謠]9%e3C43EK(>R9B|D-L6s$ZLfrĜU2u%$+3fON^6ӚspT9@׉]s?o@`&%<>{6fbݯ=,5]Czݽ$yp 9z7zd {N@ ~gb(Hpָ;TJ23kzz1Y4$i3^yс3dtc&am8'Io޿0SMTŤ^t/P9ϭ eC@Ѐgc!xr;R6Mrduٍ@#{ !^߬q1D![I 102YjFRW'8s((I+K{1p1K22}Tnd2)`G }I$eNҳ7)c[)Ch?σOiRװ viljI$4uY:^BۿM%Bib6%<|36m Qځ)̧%hӲ x7>=v}$bZGr0 MO%2T0KͣhB;yfuF"yVQٕ)/GejQQA+7 Bt ulFA?٩ aCn1Hu| ^o^)µݰ+m 5< U5U%q Թsm՚۷6JJ%%-!$ŵj;;ͲٙMv])  Lo/'i@D)LV8<^(][aAo12oWpxT,`6j^ ]l]o=׭6ʂ\@Ķ VBtWUbb_RE t+S{vsr6O'e1:neK~(4Qw`DN&{`^t+]7c B'F)LFx5'U{ј6ʚ-j%J:vwL6 L;Fb.}~ |sK-w+mn@*]4OL3FPz AdK8Wgݜ,Sώ2 :~7s"&) C .{iCZ8H3g:aލ+QW:+5?:뮴k=½U GT]/2>AxEm –6KeZ,l6KK6K`%[Ѹ>\fnJ <ڨ̮8Btn= 2{uT]@o7, 3R fPm02jAmTIC o^vy@;wE#wBBye 74$>Bq.T}№W U'Y좹Hs*LW^00ԗK-2"X*cL1c\AVB9b"_$ _ 萐q! $CR,/lF2h-A 3K̽˵ rѡй?:D&@֢K#YUtsT]'}gY/z6> hqn.i}\$UmԪ+'GgJ2,,0mm7q Нp% q> w۠sLH8~FG H⛊L8_I/HT~Szqۋ :Fh%[SKd^YY)''J ĕ'`KQ;FzJSBF3@M;S:TT$&K-D& -˕4#S+;k΂Ɉiܥu[c奶[KޞA/[5 VFC6ɑ 6":(yT)bܴpϑ*hX9;7/ܒ-9a1pZP1 CِiьipFuW=k"=JȲj_Y!Y$^22AzCIg!Y{`M3ѓɌTV# (2B2<".RNe #M EX>yah&ra[FnӁ#<(%B924h9ޒWO!3A6\-U[.⢽S ͐Rɲ_ ?8xܸD)D1,E.pٰ5@|1OXNd4S]G4h80sV.D?lG4Q< *|$NG$0Mw* ~p GZ D8pY:˃ٜA۪l(H˕`=s y*/uuEjt%(6]] ɼVfj9u1&ȏt98uGԏ^-޽ONoU#$/BaQ4qqJ_-,~NzG:;2:?=X /5zY~^|CqƇu62\_!s⾎&y<::^쭰W{L΋} [m8MCE6΄rL'mi]Y'OĴ1M|\Trq:_.7zv5d4cqVY9=rmhVJx?-y.DY>Ki1dl0]P@į `A9?2n?߇O?'ځy Ph~DvsFa%',s:^BW|QQŅ'q_,zRD<ڐ|i'YWS|˩mt l11].e޿ߘ* '!.iꃗQ Rh^6&2׭HI}$߸(*x0kddAqF`#x)\ ]#=wa)\{ǀþ2 e d3dEpx)z lA;۟tr_5LkOwQdsǜ}*ȻSyn,l0I)mEi;dfbkŎRt"01*y*@Vy2 )c?o):|RtxṿE$* N2!G(Y%wE@5h4h2Z#rR^$D>o,'xzop]s]s9Wd>9#Ϟ(k^#W3lɚ/JA Afl΁\dDɕi`T<%]֎X9_./*=ltB21#٢LyҌ'PKp1#Zt6_kq`Dd'uA!sg䤱 giII|n y4uΖ|Q;HZIR\ TByDAj(?!X--DJFڐYKx+- ?=(]ڵ$UH%r}̙-DV. 
ՙ}"I/kaD %C H""dѠI ebg+6$i옗2"%GW{uXMòv`LG'<:ShgRF1F"r4k=u!JE)@z7']Ⱦ1AbE &ӌ|:ƤdHl\e2\'2x+ceUazz/cb|v/3lsDCzȪwH@ vLHM^&D2k% !l 5-h4h|--h { [|-h4eXqKL~|96TRs@ 1Wm$GE/͡mȇdp/AW[YH{gSldYVKL["SSZ7L6N-"UVIF{T1=)}OVpJh ^*J)IB'Sf!>QhIKSjDὲ(2H%X5 ڠCh[^#*`wItOZ#U!2+~Ot JeU/f0Q[rOҴp{ўiĮ-Ooq+9sL}6tbS"Hh$.#VhGL" 5`DRm s[xشIp~92nN~} ?Ƹlw<b8\PIR-݋jSŵWA6_oq'ʣx9 n/x0y+Bml0m w7#}~n~On=1<n jېxv8gm0ů+Z/\ۊ^ iq#y8z%lK607X19B,gkH¿ā)KTItIXy͂>9}Ѽ|U(6 x\]?_((Xr L B#vA8**+y>OqԌ=Qc))x#6$'6x2+I1"MH\JN)] d QV; Tx㩎F&XظDphiJ8$YPp؇\BWl>=Y˻CBN= qM-X [%z}ۭW_ Jn * bX * bX VfZVA_ * bX * bX * bX * bX *uOVAUkĚi;JgH3e6YEC3^J>;$Y(ךY +jAk]R *q^AT0cvƘ z ^ R~L+/QaUwƫɸIX0eZDN6Iw*z}촳9}'Isତκ$1ԀfJEH%Dm5C0ǃ@$SBe80(3Y`jDzM΁橳NzJ5r_O7 >v;2f>2\ӭy/}]ʛblV$(M"3PEu^>BZED ę|HĜ,;7T$tĬP*FVG#$N^Ҟ-U1%TɓEj goY8=N&Էf]efwhǧh}~E,_dzԔpPo ~3Udp+UdhAz'SЎ5!S">0C@ѢaF; [1&@G+k9|x-c4t] }~ןmqYާP^}oUˋ˅ dF$B0Ԫ` ےVg\I gEm& 9h]b,wb\ C5 `tV).]2IFJ&z @D!(B >P1@C.>ja3),CsT9* 2j}ib9W-v 9Z.R_

:8 GGQ>8e2%9%tV`GO"w2@iA )cx&eTYPI95NTQE+4Hec:%)by2\~b#ѽ-Ǜz)%H@=ZL.A`\j(:>ΚL_֟$q屙ŕUТGg%CC(PTڳ?bR #axr܌GO)4H)6HMK3a ͹˞hǛcZ; 1ڋD(EVD!<0SVvC*I&{ҿL>mԿгLBiqUjrcGʀoڦI{-kۺ(37icW4/Az5eV\0z=T׮v$W m\w!smʙԋϗ-|yzwܝ>1R1LsZ k%?,ԃSu"r1)HVrc v `}t ӱ}8{7x[Ac"/C. vdasG^ArѢ!b{UGokB@]ʺTD=p\PlvJQ;ά(ylBoO=͗p;N}E\?mcD]Tqmٞ6|OyH~];E Q3LWb9*%ErQ} 1*gAHH4tz% GtzӬ@/1,<8'r#%˴qXVR##Die\4yWHJ]`ve~G9EU]{ T|W}׍s>KyygIÊQp("\KI;Nӯ#frh_wX cGuGr^c =Nn!}OmxNjIɦ8P)kad"<9 ҢO !Qk"tD栅ҵJ g?J]Iyӹ#K= m;W+]^f/w4/>ģ3D-I!ebHTH.s^h2M%y ψ$ATր;m<83?=QȄQ\Nup < USNGS`jKk$;$g!^@@a]y2Mi $?X9}uYm)Z#BiNSP̀9ufZ%ka1$UWNkGU+؟wch4-Hpt()$A2, 0q ʹ*$w5UNIǦک"[u@>#;A5J@f+c7_?[?>,p2dVeV$ ?hw?L`!q?m05h#z4IB.Eay+g1o~"7 ye42;#/ބ"kv:Zӏ{2\,  lc@~LE?Oi딗fYD(y@IbJ(B( M@IqA8vlZe8|I.7%(z"䨧!E7ITZ::&!NUy(o繞=Ӵ'mE{cotg+]OǷ5JߠF&l-?jG_.F <\O򢀣#R-,m"Ԁ'q™&1paNE]8:g+Sm\lp;_۵-&WW_b/H\@F J=>XD`` ܙ-,&D @3!7O=p}v6oq)(VhW0K- !Ago$vEǾWG'E~ ;3?Կ&Q l2tfܻ^[iCAxݴcV ^ f(J? haG-oBfg<)}v&Ac؆2ʴt43Vh>y:9$}Bl~UpO.aƟl' vvt|#.D1\C /4R(8c `El q7T Y[ R)hJ]2 LNv]|+q#v8y0ݕvl1jSڽr+utbDŽH YV*:\+E1&tBuEic 2{@̢̆&e5fQJ"23qaKg9g;ӏmQuGWi<*_\T(lr ;A^1 XxgQIV `*8Devׁ 68+6 !,gB RYD䑌(3q6H\~yxvh!:;ӒmqQwWr#CY krĿ Z3dŮ@Q !J@ |z\. sJ;C1nYd$6xG|$M ~|G'+qu7Fҧ=>;;nvyfm2S:~y=B{;1ӳ{Vl9zS]Y@؈e[܅'gC=0C̞(9Z0I+\Fަ.~$9ru>p &Lfu< 鄝S|[&sx1yx>5cvn^pdrJmo~]q.jpǚ(}5ʻPp&J;sw.l:̗ldZH>erd 쫏[g_댞'/+кA$ }:mzo_v2aO:y-/fn<_޽=L^~|~M;c&7=c+=`6PI( }|IY}L]xY|r8|~5Z~1.fZr^Y{oMDL")"$2h EpYM@#,{X thc)Rc3/fRJʡ1"yWVL7EҮ7҃],lYyt]s,[%rh>'34շw}o/Y R{o_ehuíIշtfu4ܲᖳo{Ow^hZF+7 ;omŹaӛ4r-M9{YMQYRv267LZO7}VT? nC(QU؈*W,Y[bi%C*t=\=Ch lI \Uq5 \Ui W,=|pE%U飁+P \UiǮzzp6p<7cט4QrtqVNvy/Cfٖww^L-D-X38Rwm0V6/ĭSJa;}woߜd"5(ji5^eՎV7xGrwXkl)30ܝFJՏ~qEGVMH)CCTz/)"#brU`G䪸N ciܡ3*e,!R\D'ju,pUuppR*=\=Cd S' lx*.c*C*e,)P+X?vWU\s4;h Ki'_\ ޼J S&.I+hä+2=\m=YuGA' yybZ'7mdZɝ兛۲_ݖZvLkK?yB7'YC$"j+#1Qst>IJ)E#^Pu1_Nn @VvVf]kn演`Z >|Vav7woێ=i6ى_Q&^@U }q`_\/. }q`_\puqAh랛5jD_weIeH rj$ϘCPBX[J71ΗW;Fzn\чӡ{<TLyA2=(h%f-D' gy&/6@ MH(CO IɃ{{$L*fXdpqcomEDXQ>֥_g*rhƷy0i3a \s:#lM("KquR)) b~2<<3 :Lacj\'ϖqm2SN˥KSNZG&d'J6QƾMfŸeESRo/K)xJ/ogMuKTO䞜S3ݒgyN(;Y! r)EV" /$mr]u  &K.IT>JsEP\a! M"6 m,^g_ڻBeWZ͘5bM+ yռ'Zͻ#1[]j^}06S/)*k'Mܙ\ qe"Z4|&@֡骊zquVYF,^:X ut^y:ksE3DK 1:doOՠKme "dRCԪhAG4K_?O^#_#wKQ~{:%{0|u^M~su:L5Qmyl eɈ༒ʡ HYqazM8 >,k܅,K mPB˜$ P1"i}k;:J~IN5P:D&X!1G.b[+&"'\*-9 -.rZ霤*rclS$ 8U+ɉ᫘w%,ܶMxYI N;>nuz!O°M("f-'G ]rhs@ MހwK |Y]P1B(5t3 hcBYBGm` @;#"Kk=xP@z{`}maM$>^-W:47 xrpŌ9eI)" ^X0S! !%p #Ei("#>j &z 42UG)8h&hâHY9zV"5R&8R7i*\#|r\zgR,Z}* t$ƀ?!\,Y#9|5(Ws Z}05Ћ7FPٗ9L!#l7nV?Mazvv E^LWe * *rCL/0~Rvv)_F:?ɖ`ߪ9}VStdMg+EL|޷XZlX{̜=dWqJ˳Mӫɋ5 9tHq@{d%FnM@*@@] t.@@] t.@@] t.@@] t.@@] t.@@] t.Bp/EkwfW5_]vkDKGbN .H7EȏS╖Vθ9R9'EJNQln|j Iu"E&Zaي 5Y x#ǐ|9ZMd[ljX'̜doG*]ǕG)J Kfi_..ְb`Z=ͳhw_b(}zSSacup=Is2ބ99F 3볰YlJVk^Sr0`oTacPT[eU}s-!MV'>B^< u:a=,Ad1Xg|@1/?S?*53jIɪX:TRZuTfr*)Q:ŽB:Dj;^.f!t8HTu-/]ãHEwU2ຽb)0VUUÇu6-OFB׼ "Gbz\N]hSxeǣK  c̫/;9 'L8t"wGBZ#dO4V?$S&`PbAk>bqBu.  g\2qtΎ>K.QX.Y<|D<;*~n3LT<$J TjrNso͟1n~pxOTd>{/B.y0Bթ< Ad8bxl>^ִY4 3j(u.9&2 |{kW'H5{|7uO'ii ?KTX}f⇫˓o/rkY}6'zlo,LWkvpOϊ+Y6z%w_l2b2tC|(R_O?ozy}+̝Wnnr쪣7۫.i%lȠ"ѩe O+Iݍ[>dIpy#/o\o~@F$$14j?k@sfi̖i?^Z8_ΎB$8[)̖>|͓b@.m.-Mz{\׹[S**[9t֛~_71 >5=FiTu94<\͇)vWdnnN2,FOтFPI#2b@ݠw/'erI4# '8'D\rt4m wIأ_=_OEZ|x~AlS]Zbsh`K?q%u9ۑFW-&I~'zz'zz'zz'zz'zz'zz'zz'zz'zz'zz'zz'zzykH|I5¾v܋a'z5c \5g!`.a<s%TG^ N  CZ)ztⰻ h݁u}]+b0ܩME{ Py0 LkP#@ ,*$[ʶ` eۚE }|@k?}'췵_pSjudu:=umQ~<1]*ߣ!C)p^0n L)C e(=}daV}hf<0ͩ%>5z֜7zqf3Y嗛 6*rj ,f dXXίNrXii0G u|,E,F5KEmӡfJud3䄆6;g&BG FB(d,JTVw>?Hj6oՇAS"VMd2.+>3|| ZȽH۠K-%EXIl2D "$]ԸCeFJ*l  Z"80u 5.I.Y`\]V`yFۂfuop5؍n:V`XMպ<&ӯx("F: ܃ 68.8\!HprQ\  OItݸa$ ǛFBƑEVc&/k L%N.4C`Xjfbc&=9ILֻԢgn?rD6 (R۲B42_iwv}uEM$NGaԒ+n4\RLm5S. 
V"<7'ar99>{ ` hpQӋpX 0=M?^MB77~(_g㫫!u{|zY}Vq2>fao8\M4雟SX$xŴ Q* pѺ6_$Wc] 79p91þ{H{-\u?G_Tn%Uۈ޺FncwMm%z.Wx8p݇msEz+ƨGv+w.MiF|j\ۺ˽'M/YQ]ZN~>&OG&4|`>c fw̏ 5C=(977wo ?>r#moWmWwz1-sL6wkS=˔L]Y\v]#Qg.;4M7ytWMmErAm&kkYo7tmv1i&jtFs3aţu'˒ȱQvmvݵFnr Oۨm* L[h I`Xb)}Q@J#r'lD2e8F;ϋSbNLCl4E\HPm65 ֖o^K̇Pb_glraF4b{QID!0JH>KQ$ѳ@2!Rtͯ樦fg3 !O \[E?Sb&_>x#& \A> |!IRJӝ\WC3 dK$%FQ96eAßGRST~貓!PF4b{";ɒo"TyRzG1JL$mBk\ i^ޒKMq@ӝG]: sȬAl k!J[ X(Z,2IY-g$Z&XW0_OG֩SFw> YolW&;ۮ=uk#?ruȑT5c&t\stOö?䤫/m$K&K=:iӞ$!,NjFlOi9& kYÔe9 edR;tHnt.m&I`R}}=ʂp).={9EZ{OnƢK%=:R$`؈g,&tЅ)X.qEV\ПHXG{.%3*`nJxkZF4U(r¦ܦkY湒M-eR2Sg`*{+0U\Ei.vx1quRe\ 㦙r0NÒry(F=xY"i9MiD٦&rU[ϔoaV~m EU3hO&Ф8߆An{sշчӨ'G}U^{\R}=\|4{LMzwDlGp۟:w/rK~K{⭪l`xݰ-s-VB^ib+K\[e"=I_OmjmH_8i~=iTS')a"ep9|~MWsʾ}= Ogߟ4i s64)['aL 5E lp0G:Hc2a6q6ac]d0 "fӏ]gFD^ :'D8Akf% +Sa`,3k˕Ry 6QzHL8cuDSdd0I%̈M-,.j 8׹=lZ+.̸( .\\kIM*eV =#JK*e0bBe& .rΥ̌;m(oȍGq ~rZ7g{ '2ڌ侼f^p9?| nLiˆq5ÎՌGRkET͕cV`c^&̳U5ZT:o3T9qyƽZngM,:X2;]ZkLP7_&7y z쵲2a0XH'5%a4UTsc%bD<NxQ>F#Wנ;nS[mmɬ#ńy#>xSMic7M yIJK+7#n_.oP.WwNfU/s̶/VqzhUx6+z5bg ą5Dp$tJRZ5•Ԅ,T (0L~sKzI Vm2߃'96G2sQemOiG'7OGgU{²|&NjlC&{]?ۿ!w+yxl6X, v<̺käkwjUcͪrd7Np>He# XZF D[ujp2xƿ?j_5ڒIU m 5};Nz2Wr%OP2\:FfT,6˷`T`*ԘV /?{g۸-пB4f(׾f!  (jk"KnIvw'Ri[rUfMvxXU/y4GޖwXv=hjjivtp̚ _%Y=R (T`m0ZJy )ڋ Lâ7=5?wf힥ʉӖ$Qr㗮LվEOy FWt(BZ}PRB:+ƴ` ؒ`tPtuR誋H`tDZn|R*uA] .CNR`tPtRG]uQWkU@B`p+ޏ]!QW]ԕ /ɔ3էgJ+'[1Bk9R\bLQ~7,,_9k^R*jD.z<<;(]ӦkuyK5\=TOeCb;jtU/w *zj-j %bR,MkZjFFΦ W؇+ϦrZ R܀EO-2HWE0B\V3黮RD]uQW ' ]1+ qM0ZʘBJz]qk ^[> uQWԕ$~_Wh8rViuF]uQWj ϖ?@!2#=˹! h!T ƩR!702qM0qr}܀2t1nB 2BT0B\)Bѕ0GeRبHfHWl[B 7GFBH+QW]t4kN3Ѓ+b%`@%ͧ2YLg戀o W(Ue]tqX,d'ԣ>}g^cM FN.Ξ]W?7M~/,I{avT,f\LHޗ{#@ˏeH%l7Vh(CjGOJ^(Mu<KU-:}YrZ]W+U,f(p@nz6*ZjJ*'w^Z瀭r1@br)i5\W(!pω%o樵rzESolڃvjhgOb-YAWsVχRz,Y8ao=QejHid ^L=(x>+n p_rm,=GJYϒ9E][T1.U@`n<]!d )꠮R~OWL FWm(ZF+kHWlI8B\BJ黮&Jh<̩FWn8\WHcgG}ZR2>U}p*qε*.„Ul}>7\C4ЪFv:iKYGNa'mfnk;sehTBƳB<ʩela-p1jP_솫xBZk}%*h:أFJ Ow"]!~)mE] m8Sґ*o Ө Xp&M! fJ:'MqoXG3 nmDhKGQrttţ-zF! b q^AD+OBJ#z]g+N"]!ҡ i]W@I꠮1OW, FW+y(BZ}tQWѕdgUWlaݱtB+ޏ]I& 1QW]ԕ[R }/= WDZs_tT NNB4nrrXpy!fTi=n@J&@ƸY! 9 L+mk&7J9]ԕrpy!. &BZ}tF]uPWV+,g^Yzn?JA#+ѰE7 orlڞ鄫Hѕm[Q*QW=Vؘ`tPtB+M*ꎮd]1ɉ qɉu,ꪋKXrWPt+&ꪃ2sjVu<]uQWRqJi@`I "&BZ|RuE]ih%,6/X(Eعyqy L)ʙ&jiHWt(Cv_ F|<7ɩ2l8r}R,dZrpDzrL '̈Pzr@˩'<ؓӚ2xB@Z}RuA]'x稺2Zi`x-%GWHdUue+ LIG\eBZ(:*t]ɆE/{PfUNJtÕ-xFkZD}ۓW:JF][Hd@`O7\ECZ⻮(/m:+$HA5O>N'B%E*7f1Mo #W0Uu]tqv}x\ U|HS1Lj3&Ҳ3R2Й8Kz7}) ߼?E+|q-.*/@U\\Rfl2’R[jsA787Cն:srZ@20$߯_reWYx̫~I~T>Y2̠j)٨'+6qr׺\1}1W nr6gD0ΒUG yQp6K, dTkJ{:6𫤟<V9ZL"yt6{d?)f[0o_JtjoqY&|LP)+gxEx@L,X h bHc,Xkbw_vOʼn_eYxdl_Z_ͧB&Uv Nj8U'd~1xUvݰjRYiBN ͮGi 40ϾTl9\to<(-Pk%䇞,桻RmiKiΊdrv7Ҏk{Xe/Pvm7W2{=?-RZeUΦEg|]<س<[.A U {ZE]W8j͎S qrf\"[Ytb#~*_^R}6t mƍ\F)j%bjdq}\6d5O#KB ƅ-#qDL}G Tz{>)uR⧯qL9KΎou 7~hr#AXA WA"=TOx9@Vˮh,:ï*_aEGv8Z"NUkÿ?qֻ/lm2߹C@Ğ=wtۥSviBvIy Yo e?R]pכwǀm~Y7RxksGw?>Su6o:Ҫn<8Z\&7JO~Y~@,%ֆ؎0g͠vԼxթM}YXY6z>dSq'ټL|_GR(}|oˌIJ%4ee^d0UBҒl!pHˆTaIO7Xv_5>KkTb)uh2C#zyr3o-ן.+DZ] )e\~p(0r§*oC&Wx3@Ub/^x \dgJízsch Xr|S=jrs K`hw0O/f776~W64 (˿jv̻!GAeժ~*]4UM(zq[Q.tdeLs@E`/c ?פb{B)X".>6\`]WY_0/B~NW87)yjl4\Z'$K%4j)[ŷEr 7|HaRoV~'B,OIj<zZ0MꭂY -q A _i*E QfԀm ~M&hٴx~OEd wŲ_LyQ:fZ:4d(rS 4,/3jl&83Q %39(z,TХ6 eY "K() &͆ʁv-<wD|đ'%E"?(|A7 g/܏P{{󇓼y.Y#j3~>v0 XaZ֣ݣtSOx]?յxgC-PՀﲹpJ3á,P+%"cW2  Dgږu$`^ KR$K,idU@N͆2,XYR*U%ܘD0Eؖq췌alhq-t#B7“‡{ڑKL|=z{qշN[ PIdZES]U\J6ےǔBbi\xIPC2qn6]5` GP0ML4m9-v'!dXqƑ6V{F$6l`&Mk6K# _Jgk 1BޢFȊ -j]]jkՠQm~ gS5n1hoA][,ιu88|kq$|T=%*^@_F&*n"ơdSDF1!ͣIB#t;STZz [4.GlG[~.jorM)9kv1vq;c̀JFC.`0yC6޸l`pǒca LCnܶ/K7.VEc='Ջo/bO1wZIŖj2eEpsU?&g;UV_.{<h.{he)5Kx 9lI޹8O\N_{@$xp$ !ۄ0@qàؗ %:M_)Dh1 ta/..bXMqýLj TBP4e(@M$m{ ݖi$2Zd[LhW"D..dJLUш SYDءM C΍)Z$hM5E(!MT0Fd8! 
|n㗷G[m_;p}uq^?1j51<}%U%%5mWFΦ^Ҩn| %So6V- M?rL̚2MQIL\'JlL 5ւ٧) 6˟H~0M6z(b-MdTKQ5zDjJcC@Qrx1*дVy^-c,ұl ]s!C.Y!8A꒘ rulJ:Ok"[Nk0+wVlC[}bG2h޽x 4OiiKn2B2Gle7y@ZY23tȞf6#~bgms|FjOL^1a#VH:!w"Q0d`Nģm]qTIiyx$x#$g6bn8p.'2VR<4hͣ=%p<F/pfL‡O=^S"w.~n-:w {>g.CtqAG:zy}p?z8/oL0f78/rB95zqWdnyX\]o}u="1>S{&^uNwғޱ _=toKl|[c^:~+Ne6oGK!lq]͐/ŀlxWDnlPrj ^Ҏc|;v+ڠf*H"sPpI%jŦXO-K@ [Wե؈%JLItzkjMDԻ JIi1r/Дv/zK;o|Qɼ>M7zk< }P0vS?FQ ߧs6\/_nw`礤AN_a `MO \ZdfpO,:I&3b@X\T&ɐR?*Sj @T3z.PȘZ888v@so7qd5zr?O/qO5?۠9C7Fn%B澨)U2'|7$G\4Ղ:Q_LrG5'9~O<}\ЊdEٖQmoD_&N_D/ L_9 74,XPa$44Ld*&(u\J;(FD*)s-F%"4LhEc7r݆N_=ү\t[LB:VX-扺S) oHm!Ɛ1`BM"BCU15$YsG%Ao}lHݡo:KF E=c\f*TJ LdAHƪ 뢰ׂGŘ xBZJs&'7lW?~5[ է]” RrM`B.~T kOM"&}։GV(**0 .27}QQCmXb$8z/hrh5asf5ԪS㤷o<<<,94uR:E5wR34QUDcHmAZ+vz*ю&GFkF>Xާx?>bEpW遳q h|PrkdVAʉ>qm dE_To4jVP5t+r.9Ÿ6#K2bgg4z<)hhrv|PoMu7ݱ|] !|DK  smOSڻ[" Iqy(=oja9tߖC7qgѵֈ' MT!{cX&i 1kY7pQFvM!CvD Si36[qE8N['_lPڍO,~3 t` Y6)C^uٽptd_}7jogtFCG|?]#VHAħv;Q$fpΕs Xr+  2^?6*aào>|\:8(HAzu P֓4:E' жAmgi"DCe˗gcz ^>Ԇ~чogízn?>|xVPec۫7u喤T_nj| 6Ӂ߲SܸC/G3.]\,6}Ld=ד{ۃ8gvFZozSzZ|LG|E)r0*VLpǁ5os"TT#0S;Lj`J.]J6!!$c|bA:19gŚ6QstP*BD/Y*ܲ8!d5KlB54rnH(׋}P z׋"~˛ݟ[6=YCz{v? -kY /BAcq9rhZ@SRX]3$Pq;Aj^4)dmH)fjL9I@OI6F_˪TR-goY ]<\{5q"S3so+"˄'꾺|Bpݗw#đ1eRCTةq!7}l&J٣f#8rzO#۟fOE^qs82Fu^`eLj؍Hgs.&H-9n>EܼtJ_Z?.c\9vGןϗd{qJKq{qppgN6GN;QFrQdfw%9(D22[X.#B ßc;?VB[ r%ơ.b>bCe?4e-gWzwH(Oyܾ~ vݖӿn9mx"NMBll)T_Dkm ގ@h&ضhZ,h`i %$'q/~gF[q,㢱Ï琇>z JZO73NCO=3<U}'o8Npؤ*Bt^騤Ae,3|0QgN'ܟ)q>1s)IY1Z qJ*m$*#IㄢHKEYoG}gja+p%|Wj2fW?P2-{ &Ȏ.H<&qWNa )&QSӒfײ]ePj2V24>o~~i8; vb0ٸCᔾ/hg scW2HYAWcHp0JYLo^\$Xeq?éaC AqT]tELMڂV9x6hme(ت B`D/~op=j%O-^o^:蕊en.Wɽ>ԳA|j3$j<*hRFm9Rmۚ(+GS4KrmkfD"-_vo|1 X+6=sepMS ( YTP2-rO۠+^m]yV8'ZA}3+7S<&p6eY< ub4Mۖws=DZ^sl4} isHQ$|dH~4ȁ [Ix/n]| 3LVMYbU٬{ MUkNK,KF~ŪFjMr3 NxV9o@D.=`\uM)UT$kLB3yv%/xi"0/w>1o!y ue3 ݾ䓍BǨ%DZﴷ ѯ:1".ݐ^@Nʝ30{spX/ys7 |\zRJiNIԞehzXuHmaTOEHS" [Bt F9Mc6[3½ RkԠjC>X QiQ+ͪȧͻaT'q.G$!\mP bJQ@}b))O]kyXE.1W'+\g~Z&{Bl&uѫObR4[ǹ1y(뙊ꙿF?T@]uu~EKLrxN]$lS<O.,y/bl_ZhQBqB& Gd;Ig՜y BF^|(_Eȳ碭5B >gȋfv ϮWqCvW:*?ZVQg~4^ F5ZWd4 (^7.&6רsio޾><(Pd|2<)o/ogR 56Ўb%j^dҁ̬DkW j%A&B2m/5{`w印to/Ǹ8p .&/RR ,wGO4Feg)0w{{1ak vs}7ӏލ |H:RˍGㄱF9% F FʸDi0hu /.Pku_`~F.hg|~퉖ڀQٽ:y xcYlmJL8;Jd|] J F*dVmmqv=3L`lYh#WFIRŢ%tp%EBhᨴ2NuR"B#Em18(l8NA(6 `)c38+ 6!U-#?Hh5Cw1mw=|"<ݒ}~ps'nIgJ|>`qPnx.E"cA#` 9%Y/΍i{~\ a3%#SJAi5 IhE-(#@|* pJPpO(;M?#MC {/j@~H`Qg tI9#rvыr{iRSJ' >}tX@h*A%MqDi0.:15:0)RiL{1y3/%IҨdGkN8# Q $iELs0(/ٚCs|sT#Ί?:K}Wǎ$??HoM[p\LXOpws1Go?b`O~Es++ɪ?C w1zx_wQ804;Kʼ C%vpNNr`x憗Ay6% ol$F:7 ? 
>\Uky OV8Q(W{y ohmmzVO]͋V)Z7bBZwYr~\#*ןB4{7 iJx,u^6DFbm$QvZRq I1:(%VK2e&I# m/0'=î0rDzQXn8poɲbpVyL1yeMɞnJ&6 ?EȈMyW"wQsZջ<ێPvdMݬ:?XUֲRjaʓA؏a'(a~,Tis \^oK+ҥP`uNi-qIBiWqoxBZXZF OL3eb9̩eJ-^ޢWdEr'ADJι+YR|^۪gY߳ݳN^ŭzɇ"4p5K_'(Yo=GKNq \]!`AUf)r*4gHW㩁=rIG}v~{o,-p7 a3n ;牛)-Bߜ!xY]Y<ܷ32I{9:E*5V΀E긤^$Sir=?gQ>D#}7M.&1xB< Vd'$Sq ΂\._`"8PaP!A-ڰ|2YTGNqbR9'6ۘ`BY1aK ߴҶƦqL4ĥTL$@} 5A)WP&rT@ l)adhU6FPvwcg )j)P5rLŦ ]Pj1.UR $"B|aT`D`#!(Z݆,=!H11&UШc&γC(x$jHm(`ZLB1B Ta%҆H|Bp`,%*d#*?AI8!%7IYP,'蘲xhM(XmV'H#޵qd׿2 zz?Cb;l6X!aԫeFG#kg}>,j(Le HpTխsϭe摤.mORio>"U Nzü 9Yr67*JԜ͵$ͪ"hy0NTR<'RIIbŠcs !ʜ%xu}d Sa]R"vLMZ%9įZ)|0 CiZZ)NqGb>I'yRt0dA%)*Ҡ}E ܔ43HTׂ.I{V 9((I ٥f9 oG0L3R/3~ E Uk\4`)[=Tl<  xisW*A5V-Tbպ*u%,d.Z@8քu`1G:)B̆fqp*3\iu Qx0IiژuȆ9 zPVPB ٻVLU:ICRcSO-` jU!RcfQ1LhthZ6p.F)NqGJ V+dWVb@,AnTzCZq2Fͷ6Q)j ȄكH`E"3|ePukʃ AׂŜI' sG !.A f|)*)ԙ5TDťlPt  pqT AP{ TtgB@Q"Ł.0͑ւEUPfҞ5ƒ#%dNҿjA!NMFAꕶcu2R".5TuA"F_R]' Lcr^5 5$鑃Hjk%}jՔ`dV."2Db0T{4 {xWP>a\FФBufpOՙ@B{݋q)b6䤡bLbb󢐴t8!bE!vl[yFi:Ҫd :!JruIJ2Ba1%OpHv9k=/3)XqAy@&CJ$rAV 2*ed&deZCP1hx L}辬$kIu u<o ox>JrA#+PZDbYьmC5YK1Z1ة`U0tߟ-NNꑑw]@,M$C.2 *c+tm i,!zԥ)G R{ȋI /ѡB-1`8B Yv@]I@(ʠvs7k3m>%TB;뒄 Xut kR([|Pm!I5MH#XflqR(jS,ԚC-6otI{H$Q5$YI$`QڀJMfnU{9zX 2IIX@iT_Cŷ%e SLЍ![[&׃܃Wq .,xxqڮ9Qe,=j0V$LC'Kac+i0 Ilf1hmTkM!J5<0G9e"|Z!xJ@]{?~M{wh=}ZO܉coH''w&n K'emN v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b':T&t&[+q\@>OwJH; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@Bg< dv?N uݜ DhΟ FXv)iL v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b';斢'*ߍhMy'P:váGUoZ08GIysk=}}Gj^40 {dг?7{-.oHm; iu}h|b;(v@Pb;(v@Pb;(v@Pb;(v@Pb;(v@Pb;(v@z/\z~:q{}~/Z=_/B{2lp P*эa hyPzņ=4l9%'CK>;O5/~cR)?^Ct2Ō8f :C}@o}yp2] ^\%wr> u:/-F^_}*sui|5;{=~Z八]eU(? }jA]bT*s# <` sPw POB/hP? ?0H{:+wCW텮 NW0]#]EzAO~ ]E]+By=LWUxЇC!at`̖j3ۦP[L U`zKq]\#{+Bj[NuDWiYnZ ]ڠw2ttާd $:]JttES1Ȱ\{+u":][)jJr8wDW8n pֽί]JkW{IWAr.L#8TBzmEX=3׾\ӛq\~nZҳ)Zf~qd8xJT.] ap|9fcbloݛ'5L Q Ƹ{ퟲz~ʒ|PM͍u64JEzYILs=vh:o sSDg# Fe“FXţyhzdq,>bsT95Z1)G+m=Yx"RV;NN(y^VrtM=ﴋ={㻡+뺡+BPe>==l\/Y!zWeeutt?nk(v2JLW]U8!CO;JKqO~*wJAk+?.Ї8D7˗^\-Nl8:¯*} zC\q} }C.EG;#~"C1t׋˓z7b`ZhNC^|:̆IA4ώ_Pnxiҍ=`?xq}Qz0_6ٷK5oޯܞml̵jܿ꾜g[tr!yV{ڦcܪxCVlu~c[Y TZjV{K]Wt~'l ת^~sNW=+zޞJWꆮTjJ눮8 ]n"Z:]Jjd舮Lp4:]\ߍ"6NWjʆ鈮v*"O+B>U;S3:2P*~[QΛ>۴aޘ.N hЪ&,z6=ܑ,lwG&n:cyw0͸( {G]?nf h?+9B$Wr{Xhΰ) c7tEpu7{< V)0]=]!:F5 c=/g_ks`6̖C|;9?'ź_)/Q%FoZyyx_H?>r-;r_mi@A}ҪWUX _>]^}U _G+ux+sy+ۓt3k*Y])_:N, ՜>_~|w ku[ 76~qlGtٮ/sov1r/7s]g=yvy:NU8~h ]k$`zɑ_ifiU`3ޗA7-8/ŅaBixx7ܟM(67sz)+_IƢJ|]okZ ҷ? N|nge8:\hE٭7=_j4x65Xowds :yWz,WG/Ǜ೛^*%Er;&p]^r㟬FqI@MJEdYoOzW#}D=[lާioFzX wBi|T?q?R?{r!%?\|7LZ} J:;-|=ep a4NBTYZ]0j0PL ̣,Xߧ}6eDf%ޭ'aܯF|@x7&ktxœ^# &LZ-6>@HҦLSLHA邰.E:kk31+D5Ƅ k4M7&gamKV2z#gUv?sھd˧K{;^MheN׷gW}MP~wuM5nB׳[u JR+|`ۂ>P|WLj@5 , @M:|.GI+@ځy `r,Lj>8# ӽ-CUJ9#4rFdH# oyb|mZ6;2`7ɪ=PFQ"RT" [#0\3Bd8]"ltvu3*qPR]fڸ 73kw?tc!_~q \  8rpϚD:y8 (-]7atx4b- -JL j 9F,Q%b:;ba&+}&26ԗoL7|chi%ښ3yښj0]NBYE*IT92e"˖E2-%8"CK3hc!F -*d렌ֈ<\ ^$D>oZε im`AsQzFpI~&' ȲtJrOihj'ɆF<˫UH;ZU^(UE,ǹ9&s&E@m Qe툍c Vƹ|;>L Cɑ aaQk#o:ڷPk1,޻`Aji-GV<-M.9k X 9&J~ )8I̲J$ 9*1@J$VkP+LMg'-~{?뉻+,2R9R$"Bf14 23 I3;@% )nw1Iӻg ]qB&6خP;ُinnx}= FGT G' J +hddRJ#C"Xq踭nYE=9km$V0 'l̝d0&L4̒ ̀)dx)cƪ0?hme&ݤ(?]"բC,K}Oj Q\c6oړ}DMyӞF8i yx7L 4ռ[ϣ#5rEh@b% ~7&^'c_oA~%ihZ,^K Yh*)2-ohhKܙW.[3Y1$‹H/AS'` t6y2**\0E,*#J 1y5fBSfr-F"])@Iwk%,.}Kk$n @4$(eJgBRjA%WS"z[-өᬡ?}~L.x"%,TBǸ*fd :QXyH;ӎױ=<M^V+ FlsT2mVGŴu`Y|l$O+ue_m9w"|c^)92f9CKPBEO&@ lָac5nB= :Dԓ@Qۖ]& -6xaE&劫IM䓈`SԌŵd=!qh9wQR$V*wbb|]flOE&3\)Rj?/MGh3~ɭ ;6}/G7/0(A?̣MO.+_ޗZbqN'RBpڣdEõ%[FayLҢl9t3ASՆ< 6_z?/Sߏz>]>oUk!QM|ŭ1K8vqLYKpӎGM1jѲjlj]5k,Z_k׶ƊXD6 . 
oWZ x$Ο?ɽo=)Yq{= Ȼl[To~ԫxLt<"wnr/D?s>m=K!p1e&iTwViZe00x/L\wC 63z ',r=].!EE&) %2%k-J^ޫQx.j,x`2_&fRWQꁻҌ_4XW˚_hcW۵9uf1rK͙W1xY $T B revf!+r9,(lD&̼jȬc@MIN]"$!%(7e/m@ܯCT: ;v^Y!_UgucRkkk)ӭ]Kqj v˸e;2J.JH-Ze=! Ru#NbP0t4`]) h7\))u].֙bV"28*cL1{m%ː#&-"mNnYL8) tdH,K˽6#W3+U,p-2xLmkF$ PX̵=[|m@W=vIz^lҲ~ >:FSq)k%Y\W7LF*qjg3W;E4c1`}xŹP'>%Db\O9H]tΚ >P*kuq%Ę#0̢].5%yg]̂d&(CL&q'rGuV{-Zŭ6y;#i+jwisT;hukG!v}#K$]r`V7i]JE76wd2HD[Mϗ9ZeL9G+izFPt!WyeOV(Ŧ NyD ՗W^z%6LFΪʣrȑ FjĔN:D"+ Oabj|mӃ,%#,*Rڀ+|Yn22v m`,K%'q_[/;,K-{Aj6Y$?VU oTኈB\@>ǮNJq(I^۰?#MC g/j@~H`QgP!tI9#bvѳf{niQz G3DH(0HKQ~!5vFaQy)O"IƩes0F1!քaBl 4agm~4o'Wgagy܇:v"1gzxkr֔br=aINz9)~=y,fvS}ĸhP[?8ŝ3ɹe%e6F ]3%9$^dxq=%˻7ZOHt6#P^w/@i^"7 iefi/؆ #NǛV RY} O/.W*9Py<JaG n]O׸O|#^.Ũ*ŭ>JulZ_+owՃg_Y̾'Fw b.( gf9+Ƒb4rA|to18z kI-uͰ*Eh\`JĆhx3ɲ`s2y'׵ZhuqF8>_\$Td8G]z+W*'+Dž\~Dv 7?{?ǷO)ӧow?i)#AIm{~H@ CGsMK]ϥ9i+]yt:#Vn;cڸνr|o\n<1 RK@tRR~F=J[_^. ijwϚSuK,9($F4( K9@DS4㎍x5nZss8/}s/ _J-@ LyYHR g=9uwk6uuY`U{ٌN^XU:.a}y.m?=rMj6JP %&2g7kh΄ ń2QAA=P𜙇VP{KM\gG]);z*wNXPcS,/'XX8wM릾?Fmӵ<#|<#KdV^m~`;¶?踆8 W,5< ͆mUbES1U,izboяWu Uܐ_xU|`~ "gs%^n#x%ӡu9KU܏ܕ~jzXj~T$Nq]TI‘*Zg>u+yI-ٽa&T_d1]ᦓ%<|v - /[mA^ s!-7T+lݲ_', ]js-O*ȝ70ӢFd|gz}5{ o~U"7,]E:__-H6Av vzPkC||\WP3V)y񙔟4QU!{5Q@lIeZ: 3&2C sպúE!u+"YB uU&`oHZ$CB3-:"^kƨQ AI'QMT6F)D;MM.x뜡DB@:(&g F8 h4FΖΈ!M1NPo^F5W . wFxJ"Ҥ)ոS,ßYōntF ʰ|J\(Y)D<.ŸZNѾ9/X%.PW9?b7mlOf4.FqkN挭ȓJ9p*LYTeY{qϻ߲QT~'CR@n[NsI(\V%2S50@$aXBRBr:QE)^+!ǂAV R6!H5026'4qƮXBc3s/r*Od!~XZ<|lr3dJ; e@^$&VT#=%UL`RPH"6K pg#9YM.$LP u;Mb`"|L+t^yxVT֠vSܱ+jCè =j&G* &)M:Z2-r3<˕1q$P#Q6=!.Nj1.]e=.nԍDH)'H6J(P~)-aLn1&Cbk4禸cWjdM`r+O (DqŜi80JRQk"tD,hacD-1r[E ϧ`ZP B,p\"Q@ˣcFOt,-dD/{_)Y)9כ% IʵIb /M b]y@hIS?=,Z#BiNt9OP&iEe<D:#HAT \cDkh$AF#9$YfE&&9A[R.p[ӯ6Xsk?Z;yS T7g2 ԡ9}jx APZg4Rs  DmdwMEh+7)퍒pjIDi TsTDti$ &qjv`%h4cDhjMДǻ?3nUg>mT=ۅ9/%IyDfAhsչs=9B2OYƻLCK,]?.қMtQfKp6fU!,nj^~Y-j^) ͟nޖ}ކ uqky*+Ý2J 5ݟV!^geK!?%JsuQBG%ѝ ɥ+3>S  CsJ4R&LL1]+pvT +LS! \erEgLmLf=\@T ;W`t=?\e*%•*mչ m!rYYGg'ś[]e{8lqdPq" ſS.p489whII~PG*-|{~KRHUGIĎ]X_=6R#dpYaqbQ)^>=0::drdjUoarWlP$mur3Xbr>fis `O;!w[ڏLD!']2"#w++HWvLm՜LeHL5+$XXe23g^ZaWJݟyD2 \er \ejy륫L%՟ؖSώF?cB8r3y=Z WRvW]rɕ\!@ig"Ww2R2JpqZ8 \,ҮUX*S){zpŁ%;WH0}G*kTW e2@{zpt 2; +t2WJ{zp%%U&XwGBj m=SzE•b jQRVQPs#[?%;_0K.80]q_ͲZ8q.ʳp.v_˕˷x='Ӂ_џ0BH࡜^]#~Q! L؃DeJĔN8S&&z/+{3Ķ0rٳl#,xYp JPhd%pJ^kfE4 k~;ɷbfL"%ܕ`, dgE8{K#՜RUIqU}0%89Yy)ҒdN 6pXJIȴՂk;Jw\ɅΘ2Le;"O%!ʑD٣)vǺֻze*Eopқt+*km+Gaؓ⭁~> fz^f1hRLq$GRN߷x,9e*:N%Y:U,x.pUE7tR1[+_3*ُ|dViJGg+uҫB('N\M_eoruͫZ/j7?.L_vaӛZ˕_(}#|/W ; @"FZ}KW%݋'ܬaUi8^__KK)FWv "?8Y]5ew~qDDŽ$9%cXʪ@eo 'U_ ½)~Ǿ |b7GL7 Վv &EcD C脓dfll)_D -Ogt qfjw=LC oVE)V N,4f "@ U E!֟VI嵵Ļ`۴j cpE4Led/26f)wC7Rc) .rF*]`! 
M֑RRD,e0/_P~[A6ʣ9V0D$] IJnc9gK҆^GU]M8߸1hAHِVLdMIC+QUbVA+=؀llFizC2~NVmH#4'՟X=?#Rj HE";'됂ꢂ)_u 9-pEX?2^m>:\w@lg@Ӂ!΃b-'|gfiBɰ;룄6Q948Poҹ2L<ŢL-eKU읣V`4"Qj]Af'VʠĚJx4L[㌡v',ݓ;p}L_~wط)h1u/LRZM>/;Ey75eu~yC]<;<ćbL$jʅ :'l$dL _ 4o]!ٮ:YMڬC7#np6޿oGL6ګ7ǔ3?9zttVO; Z}!w.>^.\|{R~uO'nwim{K/q\ ȠQR:堭1N2bT3MœгΪmw__/Y-(lٲLo\қ̔Z+ m*(jZw| M.sk iSɽX) Cypݮ}^n o{0;wZ%R8joy dXW^TP"XeX0ѲPM.0`ÆlْsQt1RRrieW KHCaj9"Wi4X,TPXxR,Ce[rnZ9||#) Eq)*F;TNT@JK(U}%[M*Q| OCuBb6`4-l v )H5FfG85s0J;Eml8`wV?"i26wHbC 82x!FV9IMhcCNKdD"XI$.&QB,;dChُSJ8Dl"nzDwiy9_I#_TT y`.f IWHH,L.0.iܖ4eW'VWYR6?8즭Bw1*4A{:I^9uV %Rf2} .Hytd<ă=]vtaHfi Ķ*0Padg1'貜RKRn'ԒԒԲ.yk+DΒt 8 d_PMo?t[7bO*m|ᅫnG|]=u=7?ohh5jQ,$TGQm lBwADԿ \RPRnvk"&`b0}D)g =Զ' R6A4%GJX Rڨ1goDIk-s9 CjX4岰[Z'P+~P_nPտN D Hкt⌮1lFT rYnAwc" CM HļԩKRc 9p;|Qx!YY([+ MzLJPCuDR"xΚsv;0^Qw)*=V:cH!LKjyq*('yգ 4E@}NYy,PM!ECgu)̶1"9'V3WCR7Io>Rd@I7ET;t$¦ CYZ"3FWǜRL'Jy:πTA3ԐP4"̆(MN'̙PE<8."²}QW,Bf*24;<VrI OPd}:fr/s\uKh㎌ fru4.0k WxH{BN9DM8Y%$&gJ6QN,2 :"&[O֯AFAb<t.41X'޲kEG%EOGy򝝠m1YRNJ)kۚbA 8v7" &KnLrN$0fW HJl*'Lřu6bn9ruHTB;RW߸'om*]8-Ӎgv]\gS ٭ȎĎЃ<;ʗU v i=UL/SJ7̑r@xJNhGff:])ݧ9ԥ˂ִpX6-u/L$(ˬ~?/MtyW=G ohrqu;B>y{> yl[۵dq%(uT};Oܬv;D}'WW9k-lun{F{p}َLLLP׋a.\Ьë~Ӣꀉ8bGDı(V'ObBiL%:W,M!f\9ļ*m]LP 3i۾Jk J鐔E:V`1:H2JKbX n_J(4ӑ欙!#5~%)޴yϙewgy/u2*Ji:u txmw?+Px*.w٧ Ӕ_j3z}wmI eo2bwskc7A'/k*H(W=3|H=HiĖ8ꧪk÷I9V[$n|z94t r1Z(X $!HP=}$>tE!>}A<|(-Bd*x̜Hu:TdN 5η5F<5Ta  -AuuT˻DϷeϵTw߅79(nSx4=IU.\0uw˳y/>8&)%Q+T8dp,R84E8{yf+{E%{^aٶ?vC%=^4MB^G=i~qVehЬ}?U~SAu͕:SjqPwVCxǢ&G<|GE…^`Y+Ńrr>]z6\&NfoGosjr1^#Gސ:g#;%zDta|<Ӳc\)U@Q:Hi=yMim%Kup1y:x\du}B* hDHxV sy\Ͼ 0Z/6[/) OWXy>s/:M*w Ip* J2Bw{J5C:&hi0>4G#$gJZ:vT"n%+(3qW%9V^2gXÔzH{|)^~w RP6[W7 S/40k+W) Xa%Z|S"s%I#Lv\-TUe@^Vޥ EECJ KEB m 1QeUr(4( '3ǹ*Р94 Atҕ8:KkKx O-J7MzAHŸFjC0#Qt`,w$D$Gp+:~tvmۋ թ5Wzi-yBJ V$Bz0R-*BT\f BpjAc8 vKYg/d& ME#0?p$(:.ig}D,~h\"`YrT[&'A|>``}P4&q(WniF6žW$S?Q| NH JN31)7dcA %{y{,,X*!pAoԋ-9,~Ď>/h8xч0KZ(N~3CU vc>x (3rp;S7<͵ſFhQ-}Dg#[*7\ QC~rWPϳr9! ^E_eqf+#Rۛod*t棒 -qQ+r[c@(֦9'} F{Csq垚E<*DK}Cu:0) jKaNq O尚۵H1i0u u 6O8y.>.=[9<9:*#wɶQ[hPQg9:|6,Td1uSv+7;U&~;˅r ~>ѿ>OGwӇg`}&&hAox3Kc\M-矋x;? g>\.(;Xxirspu_AqY^Y.Z[̍ 7"v54Нu 6ט.oeܿ]ҫqiLaliGAu6bj{r;hA>V=XDx팢ByǩH.2WZ*M#:I}6,[/JbA[A=b]u^rFARK,.2{e1؀kym?Lgt&^HQmNv@MT[`Uޕ܁($"ꀠ1}H5Iv%gKY^Ef)A0h*,n&nj‰"W51<=.giM݉NJL0QFfpLjz sKp o+`W3z+C ڀnQʙքr/.ձt&CO)MeSQ8#io('O' hZWBd:Aڳj=-[i RsHuj *LCks`d\t&X`ʡȌ % &til'Jy_Ahn29el'pVQ#"(g]0!ZBiF\,<Ğ\-ipgbmW7{E(ZۇҞO6Q|u݉謢oƨ Q"F&dpD%|H47AbNI-*:bTJ6raM魱:yxfRTkƤWܢR%Olܙ8k8knɫ7Sa׫kZST>~EY>g GuѲM],.}4Bx:YJ&:3DRTu&-%\yAK5+R\ KEi# b+{ҰU[=(Nm{5݇T 1p9M >0x9r:z 2c2AhU{0_)҅  ݜuE~Qm{]CӅ bS!:<$Ǚ&I&ŝ 8shj3s+Fz(!f7 `L\Xr S>A)Wz#(+ R(f!uR3vv i)Wsz|1C$O6c:nﭲnC-ղ` =p6SLEa uNDt #ѓ]b (2 7<t~zjzz,G4D4dQD絉ZY`˜IUoy*N `D0g{J9(K"B(K!PХqRS4+E3w|\+e -ZtvSp`>~a^ˆgn//m1kIy! 3\BR` ckє )PI-6d 0#Rt"jhX֜2v]R0JHIj/tc2' ^tWb(B|rvVmh.yA~Zx)Zb*UO5o^V4fQ Dd0uEJ\tRBq\]:si[ЦŽƐ>#sKf wG[\o$vkݚMDf!~kMzl&ͦg6 q&ezQm[O!t%J+lUj[s#M*"e~Sh 7z謐`,nfH~<,p`k{}Og&lMy̪.:w p.v _9Y."}DdۈH3s4 wo<Knj*Σ'"OJj*Hb.trUK{e#xg;#_^&NGQór`%J}F ң;mr|`[N؇s;2zyt\V!Ɩ| Bg|r2t wV15G9FjOL @@wnƳA@2Y(;Bs"-Et JcqO(pRr>Ir₈Zx4ڰpSt˥si=ڐH* AQ@A}bii O]Ĺݢ.X@ԕAavLԀϯz>lⲮob>hO\Y=/\YZyN.9^!'0\KG 6Tr/*K+,zJe>+,t7WhB{t*HjV~sհK*2>>,ޭ9[!:cDJ/~}}1g|Gc ;LDo[Q0ZᨺORbfMXF4<&:PMa^LfeSKW9ξvP8Q[ [rӋ\=l}12𛣟ŚjP57 L0KA(LL)h"$%7 {} (0R l>vaCz4B\R1WY\\zUTo^2XBܟrWMPeﺹR2ޛWhdU& c \eiΟg) Ջ1WbG/7XK(M+*Oy8Y^xj~f$m/)|Aٻ6r$W|`&ŷa6Y`{b ƺؒ#I)ۭ_^6?0?iG7s+o k}U|gg[aYYcN*sDAwW*{5qӓ"u?r~I=Xse&ȷsǁ&PF^.rټ J$wzZ3UwC_qis^a[XY&)r@_z{/~_1i p"7B$`xTmL4 B,5dK4KRRenS18-GL2E4!:l:[\_Ci-k:&/7Sk!v}7pyjچ>KBnzz|>\pݹynLr鳋ӡdЮZn}Yg=lsmKlsX>V2Zq1*y+ّ2Y95Ʉ*HöY94b,Tf'3:hQ%J22! 
>[dhE 1jhwKȴAy Q9PeL"YY -Kdkױt6 = ySV|Ѥ_9ҨCE:e;\W 7$>C;֗wwTⲣm% 2 gs'DM\QA6Jb,8C $I$ 8M\rI3*H" Kp1-:ccFṈ2N&cdIk˄ 4HܤR> nr-gMgK9ꏯ@̚x5|YF04{,0P^s!H"=vˑY-9-"g% ڐEKA9.5E*gtgU 6j)s&咒1@JkMP3fMo'=-~; ,2R9RtDݠILIT1`HҘq/* `HIޕn*ٞرª' V0IŸc9ZYȐc(5Vzi#BsT==']mCm;$8 2wMg\t2p2K02 ,Iy:%[ƑttCvݱto z1-501jRMaϠ<CNj QN>䉂ME}"HP蓉'7?JPI~L+$|P6pW(T<;bn/xfGκOpJEbPtCiJ,"8Xs`R!hGWE@b% Ny`)TuwF6*fC6*Z鹎 7cht6Cy*qO(&T }1 5DY/1I>v$~>˝[ۮ[I9iskso S7[ OÑsYj#=GΏKsqޏH#^sdfM}U_.R YlfV !|*2bJocDrr1`n-Y63w\j\ls#-H+th=h{Ͳks_jsk/An>N V)> '|>Ij`p۴)H߽m{gчŇ^5f#-+Wlήm.4-wz ݭ>NM2/׊}6+}x[BwgeM+wJ&IٛZ:  mA,+ɕ:,]APp:\&l}h+*:V 2zz@88 IqQ%jc J16 - )4R=,dt_Nl䄪t}2M*se`#:fCa)*;Mg|Hb78;;oEY>>4*ɜC az.BZZ^0s*Upm% _;XFG=3PrYxj:I:):߃B:< 5)ZP9R[Y)5$p㥉@&^r@ 3lH6J hjZ8؀{BȤ"MI|ԖhYĮ󮺚ΖLH7K4~X.H܏# "RB66 b%vF @C,$ siNTbZ<7vv2u>h-E :4p2Hv!TD$GE4.}P\uj'O)f",&Ff%##<$!?Kvڻ >4>3TdVn,մ ,DRH *3iZ(B#B慡Vܚp RKV;@OS!˜O@v-[i!Y6 |wV;Hu^;)e5_^~J?K?K]`/8?ďZyÕPOhxYBǏ0/fpF4XqJѕDDƹC0|;QLgZ"˪a {"4ˣtjŵXr$! %K;+nuwwel^+Q]/cS8ZϵwC[ԏՄ/<‹*$Z?NVhȦ[$IxAzGQڟ=-Z͋k~]T_dU`v{2$UB,'s Gtd^sd0~qh-4GFҬu$wt6 cwYސ4`YŢǣfWcφ57gmQl^O%m'u ΋is7ԬN;ax~BOt{ǟ_w?~y꯯޽8Pۦ S , /zix;peC{R‹ <H9N 6_8 uYy2WnD4jhZfNz ט.oeܿ* g>B -/ s/^2XߖmPWݝg}tMF H^{8ǬvDXS2OMK}]Rz鱝 r|́pĮj' J w>B %w +P'KH9EtB3 hgw:9(bWB˛gOBwKyb ۙ."&m: a!2+Xar,Q'c0" ejtZGs2TY^w<Ӵ9eI2IHBG3PJdeeOJo YP`7c< ØRdγH6A}6cl:>0E@QJx#``O S6Z-h  :ی,ZH&%D."Pʂ- 0TT<ÿN_jƃ ;: WTp] =;1 ψL<Nddp"ق=tN2ZD.^'D^rY!bj69?M)b̃U +%Vp2Y@'{.F,CFzy#ox 3c@DpjWBAXHeURKH!W 5Ed\9 Z_ c1qv`3w6Wgr=G&/vtmnc{3Y%«u-\gr]OQx8"Fy܃PdH Xт)=ndQ-L Lg0@O/ !GМ 'EДQE+4Hec:%)<[?NA왑^J$e R h)pSj #IㄢKEsZ_LlDŕQԢGgw%C&C ]e7r 7BK%@<+pw&B=uj?٘}ւA$bX$L#s !` (,W#LF&U{>"'>N  3!w~_=3kd<;]D#;]Zl.^t<_<vy; wע!;7ͪ=ZD;ATZ жroCQ*"e7vbL'`Hl?BBz Y܏pqJlַ %k+hqq^rv'H|6 g,ŦŎj9WBA]+]r^5@ӌ'lյMاX٣Sk"ݞ. j~/H]57yR7פ5#iIjY -?_y&miXe366n|7iUz~^N..hW;|/ В!&{5^<}в 73$iZ|mZt@e !M?_ߪ*($6j͎=rg]͍6q웚/Yj2L[^mW-u~ŵnoh=rl=#nhʬmחc;q|V29ՂZ4A4OC!TNz5}н؁^'țWb9*%ErQ} 1*džCHHl!e^^N:/ڻm =‡Ή#a"pX'F Fʸ\ڻaqٖuL[V( (/:Vve]vTQI%l|^eikzvR"v1|[ڤ-[':n~YۮK~'sYOcϽ(Cv bɭ7ġ |I- 2ٔjdM`r^ I& #( -tȊP9&z|h@10ZL)qfKvz=.DotA$#4~}gˈ+HD^{ϗVMbi$n21K$*K$ `ytHFhh/4KF q %y WIpԀ;m<8sz. 6:C֕b薹OAM9MAԖ0,')B'e#IwHB4 4Ӻ9ށВ ćP`e1eyՂHJ;OuR4nΡ@4*Y &Έ踤[Q7XJ 1MK"54 c+ fu'0q ʱ*$w5͢εce~?h }l+{8wsrv/ǐ}tfM6sDfUַoy*gJ*rXqqp2#'lj+?~qφeݶGIɍ@h$dM=Bxg͂xK6޲WoZO⫟:~ԏ"mmμ{ۣFx?ڗz-?-3BˉHl_o.[ͱm2|4Y>`9P˫esM-T?[^!U9Ҵ{.g+W\W^Φ׈˳czf4@݋kOٷOELxD֒hD6_ ke=j=+ZuKEq%4Q>Jy\ M@XݸpY]Ӽ ET //qMn_r}ݻ{1}Cߎ͟IO,3s6}-׭fB_P&|֗L{[ˤCZmꭦ^;U`xk[v;{&a8x5g_ -&o;'4.@AIP[ 2-ڽE\]"}DQꚒqސ$H\$f>Zu2YE֌QN#Dsl8(Ebh18o3 RH(EImrpk1YCXLüB127+Ar.}{tc.Bqvq$rQ;4)irpA5TrVFUDF 5 E4eD1.@)E/yXQR ii|lbt͓SϦ;Y̾^NG}PҕX`pN.876`!&|9{'_?EQMN"Hi4i!zn9 %PJ 9S F IA"*#-5r)NJAkB2N AR-lB*Db1R ͌}B>`+~j5-23n/6n̿b< ǣGGl81IUp$:T \ 铣,b#_J!{60ceRy NBPж#$&DҥavIX31K͎}Q 6 =0ح#T$&H&e-9qar-jcNIdAGDQڏ[d8 'eOM 685I*ʃ2θ༆(l*;`(n #b1qv\8qqs{Lbd_\qQ8VHA$ gr9+riTCi cpK=96.r.5;CUL#r{x{[Y#6 i( g~ԂIu,?N'Ģ@#hÀY_PH6Zh͈K aO^0瑇]&̓UU:9Y'sOr;[ǐc;Yf76Z:3uYC<F;AZE aR sLxʈa"mr1qep[JHrg}w;[2`< ':BL,SIҲC'z||,R'ԉjN P`)UWS+Vyp\?#=ˁ+#5_^b.mM庱1&od$"+l!x#«%_?~M&xZS8r׆*ʁiy-)b4n:J^$闕1.$ RWs5|Pad'HѤXEjl_4GuFgLW@ <ԋ8=_'Ivدw:a7JW~f=1s&=?UKWZŸG*WwI朻!91M3\#{ؑC ՉEV'6J@,X +5ȜaK >KG>'dɡB蓱䲸J%JBݒR22Xr/ВSLPNP`=koGe/0b:ۻ8p1~)RPr_CP$1`[b7/%RU}lkdW(J 5Ůb-U]Bv)蒤+X)g-={*A)ZaW|ǣWb`oӱ2֣ 1ͨHh %oRFĩ>Lełqr3-Szǻo6*rrϙ,7ɓ wLE5ͼy?y^2|b4y?(ς7=x3oyG0Ysbw#,K>wl!E iHR(Y$mBHYj3`uZD@kPf>k;{^3n~A@S~d}W0ƌ"Soׇ-6\cuWwofky']x\ x8r1cwK -&gRki"UHjC@_g*B*_3|A΢rQ9c"̄R*ƒQLj)sHpQK9DŽRnQ( cub Hhhl6nc<:_T^%7rMy^:?xp9!tz.|zz{BS޼Y#BW|~7KER6@- ~8"@V}$g!2z)HN-K(M*+!:nqD;6u10." 
Gqbp"r*ϽߝŤ!57Z6~R^.!'GQϝZn97dҖkN 'I%GlD\#{&CD9I BDpCF,s%0 a"n x&uTbNԱ S^]ENOTd`Ҝ}on>Oҿ矿tL|8U=_w[(՜1rk)绁{pN[i O&9?]DW~@YRMW Q-9 _v?NMeFW]*kqe^-|B ʇ2]f9~]2gVhYKGGՑ{: ^[%0R̢$d Ci,t P©Ց^0e.yI=@qc4TIȺNZc$ϱAjʨ3\hH[#tiG4}iMs)NX5\w윽+O\ʳʳAjvNⶃ8n;E1054G;SYΜSAeP*PECs <\xU8H+E2K 81$3R40hVk΁gM[0,D(Df *bT{뵍 hUilJa8r‚؍>zT!$U)rNzh@g_i] ΓX(qn SVq#%)L4Z1 `B 2Zb9:;ќd@AA,1a'PcJ<d ID88ӊNz*x0l% ߈9n3u[3aD9B*֦ sϨl3.|0 ;g8+nn\uќb]aP^8:\gOwȶb!a)\PP5Czd~rVk/;@[,A1VZ!j_ SʟMWOnX}_2t( e_a_l۰%uG?9>_5E}s6&5G:#$ ܠlF70 +(WI9-fA]-^*C5#p>N gq4n{rûtVOn ZmǍ|'#VdĤIe0*:ωdUD:)Z"apYBpA+( `iw1,[G,~a9U\?yAu ()g ;U\~*oB#wGӯC#nyxφ_X3_Jn NԧBp nCmq"BN8 fz%-"ƬLgΎxd-K|>τղ>0.KL1;ѥZH?M16ӂ6H#P) ԖX/<#} JaAL9h:f}txS.z8Ly8 Wgdrss :!f3ՕRMmœHQq; jN yvٻ8n%W~ݴMdрb`e W9Z˒<#űoq>#JjDh:&UFWU\mzP.IOQBtgd jdH*I&c7-դ| At#M7Uk|:LVhN)Ghd N+[Io<4oyi*wv*Nِ4pJ|i(kŠ LFj / PxL]UA5Ț:foR /j>|F&t֠Ձ唵Zے)؎tH\CM49o]19{/g/֜axfv'HTHp6bYQxoP3G}= vqA%]}j~l)Ɠ{-A'o [lFM%&ѫLg%l 7?ku%ҡ֒PTrL)hĪU)b#L 69u!f-&n1n^ۏ1nUL+dJ0۪ٻeYM2k_"ouuԊPAFx'!WdU8c jlI+ʝ9uUz -V0.>L~KȜ?킒uwsS2d%V\^iYf;'>?Tv!wEmq. &(Fzq I^<.\ TP9pF`CwEFGKTv$\hRX3epHPb-Č3Qkz#c7snŧW醅zơXh:c\_(>hx rp"/v7!?>ZؠbUvT4%J^2B[J 0q!IS_(ЂX@@M|nX&l&J S7RY&0Y]1'%BfQl)Ue(;!QFLmQYc쌇ُˠ~sրq,ح3" 'DǃE%`Bșe譂ҾWZ6p1,uH+/ՌWTKg߯tS^=kZ`iÝ [Z eϧo~iYz7s}owVvnV\biSF,[W~̹}̾g/W~X'=!F!%$ʱ/y뉇] r?V|,?lPȱ2R-avWr_-Ilu#,v9:[|b~XKُ?4[ݼvG3؛ɹù_r{x]ssSo[s0stY~B= 2ri*{oo.K'4P*zQR$F.C$0] U)b[PR1q@!˱*!c VGG5CQ5T)Ak,&P~rnbϤ`蘪.e|9b|ܪ~"##7Z_PJlESaqlK7vJhh`'çzߌO~z mTLPExR2p6P0` 5C@QQgy8\PZ:m+`A3%j\F`:Y7sngK ٲ EPH!(Wq B^yA j WfgHzJ_-p<(/L.ׂ11hA~Vj[,l+VgKZhUdհV7Ϗi"*JYE **D8uiA;V7ͮE=8n}Cn)~|+O %*%jl )K6WR#6q@AZ&o7#"]$JAQ.26绅{xC!UBk(f,W*mwjb2(^Jy'= F㒌0brwtMŢ 9NS)a=}%h ')䑒M?ڍvO,%̞}ȃϹCFVD 2!9H ôOdR}\\ѵ4 Z[zv/*YcTHMV}]q,rB!:+Ln*fGW9P$r3+&K(|˜S@hoӻL$>pyep2S3i6?>dOie98Nqؑpo'/ =ruP X Ƙ4` N8RiG.Č)SO^}={#hxWWv߀e_G$5^#\z$CⱴKx|bۈYX* )&J8IGމb|=l)eN`!̉=6)%sLEeb]JUA20! }*`لXbkS zmuҁ2CJ^~; Co^|4t:[Ƈ̳[a6k!Ǔ;9=|ӗ<Ap?YËqxE 40@:  +cҮūn;T͌\*FW͌(EQl[R %(h42dm):)CacKLohj*:msȚjp/`D[RV"F'jq{^e-!Wi+!Zw3g_gz_nheomym P^7h" 2+*"3T"uF:hꋶdH֠3:f\ n$" 9ЦBkfr!JDʒTN`]T[o:ΐ dKL]lM,~ş&'ÖײѣMa5@bh!!Z0)%dL;K$!Ƥc ,̀#skN,}37ŽSO2Ge0u/˕W:Yù!@OZBN X3T 1QR헰z]_A38'mJC IYE!?t0$І̌= (_jkErQɬXL1bPJ8ufҌԄFʜjAZjŝMBZ{AV't7Ʃo1C%` % "n pJMޞ'_)_wsES N/Hnw ;t I3VJi|tJ n',xW$2̻?}q7 ۹ GB´>Wg1q0Ϭt19#Q\QG]V1q;:1ZJ.㑟N<MgϏp_z˘ G0xdN0^\ŹKrZxD|r +c .|DkvzwVWb*H" 룸S+r y4%vE)mxyQend[~l̩i=7/'kewW61$UR.9tҭv92O?_Dv;񷊱 kIۆ8fZYg$ ǓZ? 
U,ddݘt<(׉2t{i1p9l{Q9Jh+ϐPKEi%<C<%瑴#rvɂ5Sk,9$)mg2MrL2J$$ b Yj>g< MED$VA oPjfc`-=ѡXԁ`7cSN:eL`5 ަDtѤX@:#҇짰H9(1`D%3~2{3T+W~%N@a˪KQg 4#d2Wًe(q)=̛hFh|=vJ=L.r1qw1Ⱦ9wf/s3TYnv9R*jvy1\2c=RPUVx@'>}QpFpVX`"}I徤%y5"}X 喭Ta:T>4YVUo|?BT sQXHH6ULe}FA=zRbOuۻR{AT[N2HSV x8"P)4s.\_y.e)GxKЋ˳놈Ӯ'[Q\Vh*5;)#a1˗K\N0n-;0bCY jł+DfMe4nF!!FǹvsRy@F^Q،#D-= Uw,soY$i@'Yn;Z>;Xz:j_y][dYBPBY=\D\Jї{:lM]=sYzKI\w2[UeJH|΃zu{=gu{;|x#=/mݻtN6J&W,(1 ǔe1?%{c Z1B6)^np՛>| KL/fvw;VH~ӓ˹  )9Mrd&Ā aAKVV>Y4B再{+WP TCvt̠Z*.cbIr1L4`*0"n/UbBkNK-2zZ)!x[sn{owwJkVV$CD1g<22+ʐӈ)zӍ\NSzZUOy6#  VrIJsҀh,y8/bB5OWTFOn*8x:EV91bq% YȤ `8)!ʽw&{y }-ТEfw]#Ak,O 5,";K]("M]Ѓj:pg ̅ 3' KH.dRa"'42*O\Ngy7sd lrq1ʐGK\OJ؇ ̦ |pr6; Gqrz!''mmxd/*EbJ2^NZQN&k4H,I(wZ-1"u\J?M\vB#k?j1%n{Kݯ͙~_[uT#N/Ltun]-~-%Mxxk)5Zx=ԩ2ر]z7/7QqƙfYRq/W-9'`hmFY-\&B%5/&?oAg7cu آy0'HT|YbcM{ ~5(pl.ǵ0lB#R4h-eq;a.܂Jj@RBEbg6u8㶘?Mkӥ4\(1B sif9.tw;IMj|v#B4߱0SyLBcxÉ;(j ѵN{l|3]0w AMg&)Ϫp6^kY 3jvަDzNlbuŰ/w 0.;:YBD~oXMr3 xv;]CDB5AKEd޹LV“%~O{]$dFƍ>ݳx!::4 (=/3Vj{(y#JED`\LHYpӹ#W80jʍu:dz,c3Gt ^WpR?OJi Ai=4RJ >OS%meUW c3rUBoGI{Edt +mBpɔJԽqʱB-N̤rSt+|$e`q.+TwDh"C kt9[Ն ҉PX̵;\Ӫ&yi@of6$g >zJӁUyΩa፴}٠uE*|Vxn]~:f>ymp(ZD^l¤ }t,2,(djS9IEXRDRve,2# y|8i\PY& Z1KL C)EA<$& ơAPmuy<"G Ip9em `š}vϐ_T>Z)c4=Ck8с8e (%AG[R!ɠ}TW]*ЫnA[ x3w;r=[ߍ2iu}og fPy,ӶReU=n'tTUy1==C6냉Y!Xf"R㖳,.[L2gI\$. ; '$x9ٯsֆ :^-nD{|KNŧ?/@!#\Zop)Zo|y"!*ѕv^x;[* A$}':QYBr"sU,@О#A:O1tc3(*=M+c=,MwxBNܕvj2,F 1y0Z[KIG\*q3a,URS+P{hήtZlgL^=Ujjj-p0] >x0veL{B `mZZwo:7^ki<!L۴ӃEKbI/P1[,ϫZzɗ,=YulCWz?{WƭIC0@nqE{e7Hrdq^,,K#{&N 5<|x\7?o,D`|O.M2#lv`QF2AfnT^hIۻx*Q<" dZNM(JEk)ފH -LJTbV]6gHP rN )DHqHV4/;0vgOݧ+տnO|~}i=rM)V&v:(V2UβXj>#-w`iNjFa;P\̺ߵ!ʘ1""BiNZA"p3e4Ok2::5]}Kd)>./WgۨdKpsvdi[>S~d-'z}(U!;e_rZG)'+B(XaOL&9J( Ŵ!;eFCTBRJ*9R$mSqωJAos2(/F_fәv&r ȅ'OCN ;7܋1N/8?mEƟ_~>?Ϳu6E(h^2Yi]zICI9K-cT{>P!JXa-l ret\Du<"maPƎYG5؝Yei+&%/dt$ 8j!r,ȵ`BwʇEc --d %U5)h EÊ#sC*:Fuẇy[~sV/ؙ|ʈcF##KAAay,cὟ BD!`%tAQQ"Y+ULH!fMn1[:üq}̋U>uv&%ȋ#/,f,Ʌd%!dͪC˨$@8)x7sWq(ڎPU=ApυELޏ$u0_CUH ɫrV](]zˢj"IgmZo~ׅr8:'%V/i> .?VXVǡ\,^x9C:kՁ[p_¿wK'@ to)ݷBMo;noؘV!|&wlwӯ_Ub&]^%V:Y0g~=--?`/?OS_{4i'n>qz]xR#晗u[ƜY{,73Y]4>| ؠؠ.8j4Enf֔q>6bNܴ7wg)k^ ~9mމ\?.T2SX@EI )Hpx٥Ў'.tvQ)TIK`נ0b *rR9{;U`E`5Tʜ$)Ρd M=qk1bc`iׁB/ {Vt>p.{Kzr'e͕[׻~k?TokyƆ[7ܺm,Z=쁛Alo-n/wi͗[F0q ynsky.o˝dKC9ע龵ݜv'udtd3SN6c;%Pms^ͩ.Q$$UH y xL1|KwK{T\`_z]LrXY*:#H IP t /bJ":ۚ kexmД`(+]BF' [VIuhXgpHڧUUF#TփvP_)>~Q])1ˌhǜ-E-- &|ѫ5l6w_T3+x\goPgIruњ 2z))JTСw9B$G^P(^YP`, 뉭Px:LP4!申%cBV;: Ξ]W_e ]ʗR*Ճ@r!@l 5 ";Y-OTEȹX@KoEDO?DY@embNC_D19d.-NrfE+O)*̪QLՊ>?voN+Mț"SD ) %r|!y,^h"g3tCyrtNPe˘$sJ)kbr"^x 67uֳd"h,a+")FS eQ![-T t6eB{ꊐztR ؅*Uxđحp-*DtY )ݍ>Xu-v[7Q;z&Ƞn@*`;d JbV'&( ^`b2g"<1 x^slf C͟A-6hRA t4 %E@!w!JDvp.[!93 [kNfOn:Ֆw;x'7 RBԓX~|1sfRO=~=6r˂ݻ'ۦ~17]|| b~{\b&'fϧg^k}1o-h>a]ͫ㒧 _]DoZ1bDU,4DЍ*1*}rgCDi%x{RRpFTdJ5Y-+ ZRG|Nh+ `1:2HGD@+Tv,ԧvigǛKOvx.4~-ᙎGէKB8sէ%E&wg˛vno>~zEkanXq`ګfwNCR;T=!zC2YSRtzM`J T"rUqI|?:%]*J'x-~ly%ZiOAfEIٙ"УN! 2a9Uv=&ܧ0 pj mN,2.m!R Ro,j].wmfqnČ.}՜N_@}8mT{u>AxJYq93p:#64pi0t%a;]U`Gzt֙'--{,]U*'^+F`骢ԣvJh6b8õJ*Z}HW/SC2X&`0hmﵫҙ^ ]irՀ[CW ]U*J#]Dr6޲ltgSsrEReB6ބoA=<-tB8Vo&&i6hΐg21b3^6O/^ScQX X"j׼K̻Gt;!f13+ßZٗXndt]`I_(_+g$3i֓\sBS CJYM$IIb,뒑MX5AǓ!D`38 ;$)p8NSk=}*J_DK:+DW0ÕB*S8*J#]@"紃68p`誢޿cV>]=޾f_׮؈Bx\IKWC_teAWvC^2h@tU7p ѪtUQZ9 +B]1`)`7h;]UGztRB*` *Z'NWHW/Pea@tŀQ`8hIڜǣ1JzHgW X]Uz0G-dз7#]= ]9eM;Ioxm-VBk5f~u OEy[-VDI/I=wnϛrYL=hָQvf4@mnD˖O&N?Vwέ%@XoدV-j6SlMܓEo&LWvI(*=#낉{0,[a? 0uQgos|`2(L~whklrde吂 Eh*YRk H%Bht1䥉+KD>yYk/%$klK黃vU4eym׺asv+ /Nw-qIWti2#33fluatY۱|J05a}= Y 6Q~xeFzWWHFhj. 
8k" W'IyWϢ!3z之UҳG¦?V#q$) 5UaLA.4&.nq#~Fj%ݬ(?_=d,k+#9ey5Խ>4Ad-_zN\ͣtp6]i.VE5+_{Ny%OMP Yc cܨ8桊 J%6Q_*pT, dΙеEz  xDHSaC%(¡pZňH msѨa@40"BeĹ11pd 1r>VD) ,QF"FUd)kf%ƔebQNz"%2o9W茕vKPZJz#5ZއQ_WgDKD~ˊGP>3.Pe9fT+b{s:zYGI~R~KA|T[E9iX:|_dgu;|<}S8eU;hYxL>YfϷǎa(BT<(i x1m0q3K>h` ͬB%BR ' W&B! .`~eRVsͳ^u>MW?mKB+:ҳo8Hb]gM ~!8 `᮴(3(ƌ* r-;չ;Ph9RQt"{phs_ QG(Ra !*$ ?)xE"DAH ŮEu ^<Ϸe&A&wS9/oů6hs|mobnݯ9$uUőbugR4ר:urs9Mb+>q!TMLpðDm.N(#䘲N1pN%p_PA+e {W2D(2*t  *4#`"a!RXB6IHaɓ)\FuuNJ{%◆ߥ؃c>BXSca DC"##0MHjư*‘r1%9%}1Wc*:VS=G\K\-6f\sx)rBL;]ė:k6 R*T*}H>j>$WCAQeŇjݯ/20g _>rl8bTXp+Kin/(}S"Qsr`pRy2FcQT-E3֮rږ  cO;M04MLtmuT Vy<#HX(,]Yo#+^ k}pv H51ӇD/u,Ie SMRҐ"hHQîcX #|E*֤d%^8'-%m2U1 |7SG:NK?𷣇)j7Օ2=o4 r^d!++O<)W|CպؒFb/3E$+EAEϐvRymOݴ'~\"c:jayLqkʁ.pVzf9%a3Euh%6P̉,?52P oe( 8$Ec֢%upi bV-K?|MaCNA(.Y1빒IT ,1c+8]xJ1C !!djc\ru6`sD?̄|{2ƌ'WS) dfh^^b6A ]u[qE TS_\:qq+nw_IRDz_YhC]]Kf I'Bd[sG\ П:_A5^5gХ:{.莯:Cb%G=&T& -*JMC&êf(fiu7wR2=>GK@p^"vBQB%M/vj4B)8Xm1Zr3s ne) T@*\"*#}2,XWԒz[%z &V;p ژra8$8N ##N

var/home/core/zuul-output/logs/kubelet.log0000644000000000000000003534560715135533264017711 0ustar rootrootJan 26 00:07:04 crc systemd[1]: Starting Kubernetes Kubelet... Jan 26 00:07:04 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 26 00:07:04 crc restorecon[4698]:
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc 
restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 
26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 
00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc 
restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc 
restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 
Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 
00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 
00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:04 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 
00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 26 00:07:05 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 26 00:07:05 crc kubenswrapper[4874]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.388359 4874 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395745 4874 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395806 4874 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395817 4874 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395826 4874 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395837 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395846 4874 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395857 4874 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395867 4874 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395876 4874 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395884 4874 feature_gate.go:330] unrecognized feature gate: Example Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395893 4874 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395904 4874 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395918 4874 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395928 4874 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395937 4874 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395946 4874 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395980 4874 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395989 4874 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.395999 4874 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396008 4874 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396018 4874 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396028 4874 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396036 4874 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396046 4874 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396054 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396063 4874 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396072 4874 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396080 4874 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396088 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396098 4874 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396108 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396133 4874 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396146 4874 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396156 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396166 4874 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396176 4874 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396187 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396198 4874 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 26 00:07:05 crc 
kubenswrapper[4874]: W0126 00:07:05.396208 4874 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396218 4874 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396232 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396243 4874 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396252 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396261 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396269 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396277 4874 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396284 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396297 4874 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396396 4874 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396410 4874 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396420 4874 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396431 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396442 4874 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396452 4874 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396460 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396469 4874 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396477 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396485 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396494 4874 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396502 4874 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396510 4874 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396518 4874 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396526 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396542 4874 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396552 4874 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396562 4874 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396572 4874 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396581 4874 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396593 4874 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396601 4874 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.396610 4874 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397150 4874 flags.go:64] FLAG: --address="0.0.0.0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397183 4874 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397216 4874 flags.go:64] FLAG: --anonymous-auth="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397233 4874 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397250 4874 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397262 4874 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397280 4874 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397295 4874 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 26 
00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397308 4874 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397320 4874 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397334 4874 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397365 4874 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397379 4874 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397393 4874 flags.go:64] FLAG: --cgroup-root="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397404 4874 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397416 4874 flags.go:64] FLAG: --client-ca-file="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397427 4874 flags.go:64] FLAG: --cloud-config="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397438 4874 flags.go:64] FLAG: --cloud-provider="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397450 4874 flags.go:64] FLAG: --cluster-dns="[]" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397471 4874 flags.go:64] FLAG: --cluster-domain="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397483 4874 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397492 4874 flags.go:64] FLAG: --config-dir="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397501 4874 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397511 4874 flags.go:64] FLAG: --container-log-max-files="5" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397528 4874 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397538 4874 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397547 4874 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397557 4874 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397567 4874 flags.go:64] FLAG: --contention-profiling="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397578 4874 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397587 4874 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397597 4874 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397606 4874 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397621 4874 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397634 4874 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397646 4874 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397658 4874 flags.go:64] FLAG: --enable-load-reader="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397668 4874 flags.go:64] FLAG: 
--enable-server="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397679 4874 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397704 4874 flags.go:64] FLAG: --event-burst="100" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397716 4874 flags.go:64] FLAG: --event-qps="50" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397728 4874 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397738 4874 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397749 4874 flags.go:64] FLAG: --eviction-hard="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397765 4874 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397776 4874 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397788 4874 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397817 4874 flags.go:64] FLAG: --eviction-soft="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397830 4874 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397841 4874 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397853 4874 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397864 4874 flags.go:64] FLAG: --experimental-mounter-path="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397874 4874 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397886 4874 flags.go:64] FLAG: --fail-swap-on="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397896 4874 flags.go:64] FLAG: --feature-gates="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397911 4874 flags.go:64] FLAG: --file-check-frequency="20s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397929 4874 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397941 4874 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.397952 4874 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398006 4874 flags.go:64] FLAG: --healthz-port="10248" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398019 4874 flags.go:64] FLAG: --help="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398031 4874 flags.go:64] FLAG: --hostname-override="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398042 4874 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398054 4874 flags.go:64] FLAG: --http-check-frequency="20s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398065 4874 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398078 4874 flags.go:64] FLAG: --image-credential-provider-config="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398089 4874 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398101 4874 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 26 
00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398113 4874 flags.go:64] FLAG: --image-service-endpoint="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398125 4874 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398139 4874 flags.go:64] FLAG: --kube-api-burst="100" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398152 4874 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398164 4874 flags.go:64] FLAG: --kube-api-qps="50" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398176 4874 flags.go:64] FLAG: --kube-reserved="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398188 4874 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398198 4874 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398211 4874 flags.go:64] FLAG: --kubelet-cgroups="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398222 4874 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398233 4874 flags.go:64] FLAG: --lock-file="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398244 4874 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398255 4874 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398268 4874 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398302 4874 flags.go:64] FLAG: --log-json-split-stream="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398340 4874 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398352 4874 flags.go:64] FLAG: --log-text-split-stream="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398363 4874 flags.go:64] FLAG: --logging-format="text" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398376 4874 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398390 4874 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398408 4874 flags.go:64] FLAG: --manifest-url="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398421 4874 flags.go:64] FLAG: --manifest-url-header="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398437 4874 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398450 4874 flags.go:64] FLAG: --max-open-files="1000000" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398464 4874 flags.go:64] FLAG: --max-pods="110" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398475 4874 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398486 4874 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398498 4874 flags.go:64] FLAG: --memory-manager-policy="None" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398510 4874 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398523 4874 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 26 
00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398535 4874 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398547 4874 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398577 4874 flags.go:64] FLAG: --node-status-max-images="50" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398588 4874 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398601 4874 flags.go:64] FLAG: --oom-score-adj="-999" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398613 4874 flags.go:64] FLAG: --pod-cidr="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398625 4874 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398643 4874 flags.go:64] FLAG: --pod-manifest-path="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398655 4874 flags.go:64] FLAG: --pod-max-pids="-1" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398668 4874 flags.go:64] FLAG: --pods-per-core="0" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398679 4874 flags.go:64] FLAG: --port="10250" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398691 4874 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398703 4874 flags.go:64] FLAG: --provider-id="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398715 4874 flags.go:64] FLAG: --qos-reserved="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398728 4874 flags.go:64] FLAG: --read-only-port="10255" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398740 4874 flags.go:64] FLAG: --register-node="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398751 4874 flags.go:64] FLAG: --register-schedulable="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398764 4874 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398788 4874 flags.go:64] FLAG: --registry-burst="10" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398801 4874 flags.go:64] FLAG: --registry-qps="5" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398812 4874 flags.go:64] FLAG: --reserved-cpus="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398844 4874 flags.go:64] FLAG: --reserved-memory="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398864 4874 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398877 4874 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398890 4874 flags.go:64] FLAG: --rotate-certificates="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398902 4874 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398913 4874 flags.go:64] FLAG: --runonce="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398925 4874 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398937 4874 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 26 00:07:05 crc 
kubenswrapper[4874]: I0126 00:07:05.398951 4874 flags.go:64] FLAG: --seccomp-default="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.398998 4874 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399016 4874 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399028 4874 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399041 4874 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399054 4874 flags.go:64] FLAG: --storage-driver-password="root" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399065 4874 flags.go:64] FLAG: --storage-driver-secure="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399076 4874 flags.go:64] FLAG: --storage-driver-table="stats" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399088 4874 flags.go:64] FLAG: --storage-driver-user="root" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399099 4874 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399110 4874 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399123 4874 flags.go:64] FLAG: --system-cgroups="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399134 4874 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399157 4874 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399168 4874 flags.go:64] FLAG: --tls-cert-file="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399179 4874 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399204 4874 flags.go:64] FLAG: --tls-min-version="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399215 4874 flags.go:64] FLAG: --tls-private-key-file="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399227 4874 flags.go:64] FLAG: --topology-manager-policy="none" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399238 4874 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399249 4874 flags.go:64] FLAG: --topology-manager-scope="container" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399262 4874 flags.go:64] FLAG: --v="2" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399278 4874 flags.go:64] FLAG: --version="false" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399293 4874 flags.go:64] FLAG: --vmodule="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399308 4874 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.399324 4874 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399628 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399641 4874 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399652 4874 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399661 4874 feature_gate.go:330] unrecognized 
feature gate: VSphereStaticIPs Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399669 4874 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399677 4874 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399685 4874 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399693 4874 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399704 4874 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399712 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399720 4874 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399729 4874 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399737 4874 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399745 4874 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399753 4874 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399761 4874 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399770 4874 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399778 4874 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399785 4874 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399794 4874 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399802 4874 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399811 4874 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399819 4874 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399826 4874 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399834 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399842 4874 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399851 4874 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399859 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399869 4874 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399878 4874 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399886 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399896 4874 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399907 4874 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399916 4874 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399927 4874 feature_gate.go:330] unrecognized feature gate: Example Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399938 4874 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399948 4874 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.399991 4874 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400005 4874 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400014 4874 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400026 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400037 4874 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400050 4874 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400061 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400071 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400082 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400092 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400103 4874 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400112 4874 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400122 4874 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400133 4874 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400143 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400155 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400165 4874 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400175 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400185 4874 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400195 4874 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400208 4874 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400221 4874 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400231 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400244 4874 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400255 4874 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400265 4874 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400277 4874 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400285 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400294 4874 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400335 4874 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400346 4874 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400356 4874 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400367 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.400377 4874 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.400395 4874 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.414662 4874 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.414736 4874 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414906 4874 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414922 4874 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414932 4874 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414941 4874 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414951 4874 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.414986 4874 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415002 4874 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415012 4874 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415021 4874 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415031 4874 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415041 4874 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415051 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415061 4874 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415069 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415078 4874 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415089 4874 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415098 4874 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415108 4874 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415116 4874 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415126 4874 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415135 4874 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415143 4874 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415151 4874 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415160 4874 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415167 4874 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415177 4874 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415185 4874 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415192 4874 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415200 4874 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415208 4874 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415216 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 26 00:07:05 crc 
kubenswrapper[4874]: W0126 00:07:05.415224 4874 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415232 4874 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415241 4874 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415251 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415259 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415267 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415275 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415282 4874 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415290 4874 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415298 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415306 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415314 4874 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415322 4874 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415329 4874 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415337 4874 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415346 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415354 4874 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415362 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415370 4874 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415378 4874 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415386 4874 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415394 4874 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415401 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415409 4874 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415417 4874 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415425 4874 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415433 4874 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415441 4874 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415448 4874 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415456 4874 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415464 4874 feature_gate.go:330] unrecognized feature gate: Example Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415471 4874 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415479 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415489 4874 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415499 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415507 4874 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415515 4874 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415522 4874 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415530 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415540 4874 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.415555 4874 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415813 4874 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415825 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415834 4874 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415842 4874 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415849 4874 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415858 4874 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415865 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415873 4874 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415881 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415888 4874 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415899 4874 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415909 4874 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415919 4874 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415928 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415936 4874 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415946 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415955 4874 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415984 4874 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.415993 4874 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416001 4874 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416009 4874 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416019 4874 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416027 4874 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416035 4874 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416043 4874 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416053 4874 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416061 4874 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416069 4874 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416077 4874 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416085 4874 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416094 4874 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416103 4874 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416112 4874 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImagesAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416120 4874 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416132 4874 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416142 4874 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416152 4874 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416162 4874 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416170 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416180 4874 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416188 4874 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416197 4874 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416207 4874 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416216 4874 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416227 4874 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416237 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416247 4874 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416322 4874 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416332 4874 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416342 4874 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416350 4874 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416359 4874 feature_gate.go:330] unrecognized feature gate: Example Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416368 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416376 4874 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416384 4874 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416392 4874 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416401 4874 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416410 4874 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416418 4874 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416425 4874 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416434 4874 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416442 4874 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416450 4874 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416458 4874 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416467 4874 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416475 4874 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416482 4874 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416491 4874 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416498 4874 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416506 4874 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.416515 4874 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.416527 4874 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.416863 4874 server.go:940] "Client rotation is on, will bootstrap in background" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.421243 4874 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.421399 4874 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
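[Editor's note, not part of the captured log] The block above is dominated by feature-gate handling: the kubelet warns for each gate name it does not recognize (these look like OpenShift-level gates such as VSphereMultiVCenters or InsightsConfigAPI, which the upstream kubelet's gate registry presumably has no entry for, and the same names repeat because the gate set is evaluated more than once during startup), flags KMSv1 as deprecated, notes that DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy and CloudDualStackNodeIPs are already GA, and twice prints the effective map it will actually run with. A minimal sketch of how those messages could be tallied from a saved copy of this log follows; the script, its regexes, and the assumption that each warning and its gate name sit on one line are editor-supplied, not anything shipped with the kubelet.

#!/usr/bin/env python3
"""Hypothetical helper: summarize the kubelet feature-gate messages seen
in a saved log -- the 'unrecognized feature gate' warnings and the
effective 'feature gates: {map[...]}' line."""
import re
import sys
from collections import Counter

# assumes the gate name follows the warning text on the same line
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\S+)")
# captures the body of: feature gates: {map[Name:true Other:false ...]}
EFFECTIVE = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

def summarize(lines):
    unknown = Counter()
    effective = {}
    for line in lines:
        for name in UNRECOGNIZED.findall(line):
            unknown[name] += 1
        for body in EFFECTIVE.findall(line):
            # entries look like "KMSv1:true NodeSwap:false ..."
            for entry in body.split():
                name, _, value = entry.partition(":")
                effective[name] = value == "true"
    return unknown, effective

if __name__ == "__main__":
    unknown, effective = summarize(sys.stdin)
    print(f"{len(unknown)} gate names not recognized by this kubelet build")
    for name, count in sorted(unknown.items()):
        print(f"  {name} (warned {count}x)")
    print("effective gates:", effective)

Usage would be along the lines of: python3 gate_summary.py < kubelet.log (hypothetical filenames).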
Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.422346 4874 server.go:997] "Starting client certificate rotation" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.422388 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.422595 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-19 23:35:01.925389968 +0000 UTC Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.422697 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.431246 4874 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.435014 4874 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.435637 4874 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.457499 4874 log.go:25] "Validated CRI v1 runtime API" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.483307 4874 log.go:25] "Validated CRI v1 image API" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.486610 4874 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.491566 4874 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-26-00-02-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.491655 4874 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.526903 4874 manager.go:217] Machine: {Timestamp:2026-01-26 00:07:05.524543173 +0000 UTC m=+0.258649750 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a8c485be-6bf0-44d9-8c05-f7322b3cb7a0 BootID:e73dcac0-015f-4575-8403-69ce0bd9d896 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 
DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c0:8e:aa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c0:8e:aa Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:da:66:c9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:70:51:6e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9a:37:70 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b1:cb:52 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:56:c8:05:4d:d8:83 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:b8:5f:a9:6f:7f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.527408 4874 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.527646 4874 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.528659 4874 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.529115 4874 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.529193 4874 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.529582 4874 topology_manager.go:138] "Creating topology manager with none policy" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.529602 4874 
container_manager_linux.go:303] "Creating device plugin manager" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.530024 4874 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.530082 4874 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.530521 4874 state_mem.go:36] "Initialized new in-memory state store" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.530697 4874 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.531727 4874 kubelet.go:418] "Attempting to sync node with API server" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.531769 4874 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.531812 4874 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.531836 4874 kubelet.go:324] "Adding apiserver pod source" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.531854 4874 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.534304 4874 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.534916 4874 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.535052 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.535053 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.535234 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.535249 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.535922 4874 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537006 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 26 00:07:05 crc 
kubenswrapper[4874]: I0126 00:07:05.537051 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537067 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537083 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537105 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537119 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537134 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537159 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537176 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537194 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537215 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537229 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.537511 4874 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.538336 4874 server.go:1280] "Started kubelet" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.538860 4874 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.538858 4874 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.539339 4874 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.540577 4874 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 26 00:07:05 crc systemd[1]: Started Kubernetes Kubelet. 
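[Editor's note, not part of the captured log] By this point the kubelet has started, but every call to the API server — the CSR request, the Service and Node reflector lists, the CSINode lookup above, and the node event write, CSIDriver reflector, and lease creation just below — fails with "dial tcp 38.102.83.198:6443: connect: connection refused" against https://api-int.crc.testing:6443, consistent with the control-plane static pods not yet being up on this single-node CRC cluster. A small sketch for pulling those failures out of a saved log follows; the script and its regex are illustrative assumptions, not part of the log.

#!/usr/bin/env python3
"""Hypothetical helper: count 'connection refused' dial failures per
endpoint in a saved kubelet log, so the unreachable API endpoint is
easy to spot."""
import re
import sys
from collections import Counter

# matches e.g.: dial tcp 38.102.83.198:6443: connect: connection refused
REFUSED = re.compile(r"dial tcp ([\d.]+:\d+): connect: connection refused")

def refused_endpoints(lines):
    counts = Counter()
    for line in lines:
        for endpoint in REFUSED.findall(line):
            counts[endpoint] += 1
    return counts

if __name__ == "__main__":
    for endpoint, n in refused_endpoints(sys.stdin).most_common():
        print(f"{endpoint}: {n} refused connection(s)")

Run as, e.g., python3 refused_endpoints.py < kubelet.log (hypothetical filename); on the excerpt above it would report only 38.102.83.198:6443.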
Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.541807 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.542011 4874 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.541996 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:17:38.955010249 +0000 UTC Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.545007 4874 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.545089 4874 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.545558 4874 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.546039 4874 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.545022 4874 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e1f3732c01443 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-26 00:07:05.538294851 +0000 UTC m=+0.272401428,LastTimestamp:2026-01-26 00:07:05.538294851 +0000 UTC m=+0.272401428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.546689 4874 factory.go:55] Registering systemd factory Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.546741 4874 factory.go:221] Registration of the systemd container factory successfully Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.548115 4874 factory.go:153] Registering CRI-O factory Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.548174 4874 factory.go:221] Registration of the crio container factory successfully Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.548272 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.548336 4874 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.548378 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 
00:07:05.548406 4874 factory.go:103] Registering Raw factory Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.548451 4874 manager.go:1196] Started watching for new ooms in manager Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.549187 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="200ms" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.550480 4874 server.go:460] "Adding debug handlers to kubelet server" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.555279 4874 manager.go:319] Starting recovery of all containers Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.567729 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.567845 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.567903 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.567935 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568079 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568161 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568188 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568213 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568245 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568273 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568301 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568330 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568363 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568396 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568427 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568474 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568509 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568542 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568573 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568603 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568631 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568654 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568676 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568696 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568724 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568744 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568768 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568790 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568809 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568829 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568847 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568882 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568901 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568920 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568940 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.568990 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569016 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569035 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569057 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569076 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569096 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569114 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569133 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569152 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569174 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569193 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569211 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569232 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569253 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569273 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569295 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569315 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569345 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569368 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569389 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569410 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569432 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569459 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569486 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569512 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569538 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569565 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569591 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569619 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569644 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569670 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569735 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569760 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569780 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569800 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569820 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569840 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569862 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569882 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569904 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569925 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.569946 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570009 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570037 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570142 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570296 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570342 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570375 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570407 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570439 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570476 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570512 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570542 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570572 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570599 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570629 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570658 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570686 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570716 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570749 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570775 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570803 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570835 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.570863 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572309 4874 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572368 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572403 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572433 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572471 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572507 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572560 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572602 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572638 4874 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572667 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572700 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572733 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572764 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572793 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572829 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572857 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572885 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572913 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.572941 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573008 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573038 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573071 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573098 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573128 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573156 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573184 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573214 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573239 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573269 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573297 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573405 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573442 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573473 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573500 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573530 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573556 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573583 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573612 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573647 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573675 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573707 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.573737 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.575577 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576030 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576059 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576270 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576292 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576500 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576524 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576731 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.576771 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.577414 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.577546 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.577567 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.577586 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.577703 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578333 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578384 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578430 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578458 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578486 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578522 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578551 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578588 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578650 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578680 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578707 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578745 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578773 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578800 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578837 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578866 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578901 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.578936 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579040 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579085 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579113 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579150 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579179 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579205 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579241 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579269 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579303 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579331 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579357 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579392 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579419 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579456 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579484 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579517 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579553 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579584 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579621 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579647 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.579677 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580010 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580049 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580088 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580117 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580146 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580225 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580256 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580286 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580323 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580352 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580395 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580424 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580452 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580492 4874 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580519 4874 reconstruct.go:97] "Volume reconstruction finished" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.580542 4874 reconciler.go:26] "Reconciler: start to sync state" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.586466 4874 manager.go:324] Recovery completed Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.606170 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.608391 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.608447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.608460 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.609725 4874 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.609744 4874 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.609773 4874 state_mem.go:36] "Initialized new in-memory state store" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.616694 4874 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.619086 4874 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.619137 4874 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.619173 4874 kubelet.go:2335] "Starting kubelet main sync loop" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.619232 4874 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 26 00:07:05 crc kubenswrapper[4874]: W0126 00:07:05.620276 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.620328 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.623003 4874 policy_none.go:49] "None policy: Start" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.623903 4874 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.623989 4874 state_mem.go:35] "Initializing new in-memory state store" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.647492 4874 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.689779 4874 manager.go:334] "Starting Device Plugin manager" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.689885 4874 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.689910 4874 server.go:79] "Starting device plugin registration server" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.692441 4874 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.692536 4874 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.692834 4874 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.693046 4874 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.693073 4874 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.700812 4874 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.720371 4874 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Jan 26 00:07:05 crc kubenswrapper[4874]: 
I0126 00:07:05.720680 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722034 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722357 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722742 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.722853 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723457 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723497 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723512 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723648 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723743 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.723777 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724169 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724249 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724848 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724872 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724933 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.724908 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725008 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725197 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725292 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725337 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725946 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.725999 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726014 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726135 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726167 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726184 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726390 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.726436 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727261 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727309 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727326 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727332 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727358 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727369 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727685 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.727725 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.728455 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.728494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.728510 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.750385 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="400ms" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783517 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783561 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783585 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783605 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783626 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783740 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783887 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.783917 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784002 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784036 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784076 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784098 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784117 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784138 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.784168 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.792735 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 
00:07:05.794448 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.794496 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.794508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.794542 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.795086 4874 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886131 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886192 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886215 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886232 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886251 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886269 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886290 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 
00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886310 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886326 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886318 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886343 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886360 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886379 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886396 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886409 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886413 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886440 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886443 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886461 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886486 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886537 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886563 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886586 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886609 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886634 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886657 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886680 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886703 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886727 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.886750 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.996011 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.997491 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.997530 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.997541 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:05 crc kubenswrapper[4874]: I0126 00:07:05.997570 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:05 crc kubenswrapper[4874]: E0126 00:07:05.998129 4874 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.052134 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.079749 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.080552 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e3def3fab7c6da8b90118522551ac73ad837a0be03619d9ce876fd9dad5e7bfb WatchSource:0}: Error finding container e3def3fab7c6da8b90118522551ac73ad837a0be03619d9ce876fd9dad5e7bfb: Status 404 returned error can't find the container with id e3def3fab7c6da8b90118522551ac73ad837a0be03619d9ce876fd9dad5e7bfb Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.087071 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.104712 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2d82faad93279cf5206a101b25ff54c030f96e9c066f1dc1b72abf4e3295a6f8 WatchSource:0}: Error finding container 2d82faad93279cf5206a101b25ff54c030f96e9c066f1dc1b72abf4e3295a6f8: Status 404 returned error can't find the container with id 2d82faad93279cf5206a101b25ff54c030f96e9c066f1dc1b72abf4e3295a6f8 Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.109425 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-46b62d820404bd8e5d92ccf19a35e200ffb0dcd0067aa6f6247452c055aac700 WatchSource:0}: Error finding container 46b62d820404bd8e5d92ccf19a35e200ffb0dcd0067aa6f6247452c055aac700: Status 404 returned error can't find the container with id 46b62d820404bd8e5d92ccf19a35e200ffb0dcd0067aa6f6247452c055aac700 Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.110264 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.119832 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.126679 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-19936882c353c06a024af257143bbeb7c5f8fecc2d5d00bf564868dab9d72589 WatchSource:0}: Error finding container 19936882c353c06a024af257143bbeb7c5f8fecc2d5d00bf564868dab9d72589: Status 404 returned error can't find the container with id 19936882c353c06a024af257143bbeb7c5f8fecc2d5d00bf564868dab9d72589 Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.151997 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-382eacd6212b9ca09ed2f10aa3ad78458e5b9eb0e461dad57003d6b550e6f29f WatchSource:0}: Error finding container 382eacd6212b9ca09ed2f10aa3ad78458e5b9eb0e461dad57003d6b550e6f29f: Status 404 returned error can't find the container with id 382eacd6212b9ca09ed2f10aa3ad78458e5b9eb0e461dad57003d6b550e6f29f Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.152683 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="800ms" Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.350197 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.350312 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.398565 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.399988 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.400038 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.400051 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.400080 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.400624 4874 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.540576 4874 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.542655 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 14:33:30.633515911 +0000 UTC Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.595115 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.595311 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.627013 4874 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c" exitCode=0 Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.627092 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.627291 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2d82faad93279cf5206a101b25ff54c030f96e9c066f1dc1b72abf4e3295a6f8"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.627486 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 
26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629188 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9" exitCode=0 Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629263 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629301 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3def3fab7c6da8b90118522551ac73ad837a0be03619d9ce876fd9dad5e7bfb"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629683 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629743 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.629763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.630023 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631120 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631151 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631164 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631311 4874 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00" exitCode=0 Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631375 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631398 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"382eacd6212b9ca09ed2f10aa3ad78458e5b9eb0e461dad57003d6b550e6f29f"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.631989 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.633336 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.633387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 
00:07:06.633417 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.635502 4874 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca" exitCode=0 Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.635624 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.635799 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"19936882c353c06a024af257143bbeb7c5f8fecc2d5d00bf564868dab9d72589"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.637342 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.637584 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638702 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638766 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638797 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.638839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.640820 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a"} Jan 26 00:07:06 crc kubenswrapper[4874]: I0126 00:07:06.640866 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"46b62d820404bd8e5d92ccf19a35e200ffb0dcd0067aa6f6247452c055aac700"} Jan 26 00:07:06 crc kubenswrapper[4874]: W0126 00:07:06.686170 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.686283 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:06 crc kubenswrapper[4874]: E0126 00:07:06.954579 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="1.6s" Jan 26 00:07:07 crc kubenswrapper[4874]: W0126 00:07:07.051321 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Jan 26 00:07:07 crc kubenswrapper[4874]: E0126 00:07:07.051443 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.201176 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.204222 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.204280 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.204302 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.204332 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:07 crc kubenswrapper[4874]: E0126 00:07:07.204927 4874 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.506022 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.544158 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:42:33.194711165 +0000 UTC Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.646278 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.646327 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.646339 4874 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.646396 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.647772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.647803 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.647813 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.648367 4874 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32" exitCode=0 Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.648446 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.648619 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.649422 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.649444 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.649452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.652376 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.652400 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.652410 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.652420 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.655273 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.655477 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.656607 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.656650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.656661 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.661804 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.661843 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.661862 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c"} Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.661885 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.662919 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.663014 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:07 crc kubenswrapper[4874]: I0126 00:07:07.663043 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.545391 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 08:51:26.809018782 +0000 UTC Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.672805 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f"} Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.673012 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.674985 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.675050 4874 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.675072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.678723 4874 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c" exitCode=0 Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.678901 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.679148 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c"} Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.679467 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.679808 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.679893 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.680421 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.680469 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.680489 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.681305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.681514 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.681656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.681931 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.682000 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.682019 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.782387 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.805581 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.807277 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 
00:07:08.807338 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.807355 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.807389 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:08 crc kubenswrapper[4874]: I0126 00:07:08.827346 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.506622 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.545592 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:02:48.449469078 +0000 UTC Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689480 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b"} Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689542 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624"} Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689555 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc"} Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689566 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6"} Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689577 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689646 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.689859 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.690738 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.690791 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.690814 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.691416 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.691479 4874 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:09 crc kubenswrapper[4874]: I0126 00:07:09.691501 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.437427 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.546153 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 13:48:32.028589432 +0000 UTC Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.585444 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.697484 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698054 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698099 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698019 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75"} Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698229 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698646 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698695 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.698990 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.699027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.699044 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.699328 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.699366 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.699382 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.851316 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:07:10 crc 
kubenswrapper[4874]: I0126 00:07:10.851617 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.853531 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.853583 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:10 crc kubenswrapper[4874]: I0126 00:07:10.853595 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.546607 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:31:00.047041952 +0000 UTC Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.701089 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.701174 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.701263 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.701381 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706468 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706641 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706491 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706744 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706877 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706927 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:11 crc kubenswrapper[4874]: I0126 00:07:11.706950 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.097820 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.507784 4874 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.508139 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.548214 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 13:03:46.522129425 +0000 UTC Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.700723 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.704465 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.704513 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706697 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706749 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706796 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706814 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706759 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.706912 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:12 crc kubenswrapper[4874]: I0126 00:07:12.708129 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.548501 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:07:57.216941573 +0000 UTC Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.708380 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.709763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.709828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.709849 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.798586 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.798793 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.800804 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.801073 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.801114 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:13 crc kubenswrapper[4874]: I0126 00:07:13.807151 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:14 crc kubenswrapper[4874]: I0126 00:07:14.549100 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:43:33.748201564 +0000 UTC Jan 26 00:07:14 crc kubenswrapper[4874]: I0126 00:07:14.711866 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:14 crc kubenswrapper[4874]: I0126 00:07:14.712794 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:14 crc kubenswrapper[4874]: I0126 00:07:14.712881 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:14 crc kubenswrapper[4874]: I0126 00:07:14.712897 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:15 crc kubenswrapper[4874]: I0126 00:07:15.549725 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 01:01:51.464346896 +0000 UTC Jan 26 00:07:15 crc kubenswrapper[4874]: E0126 00:07:15.701025 4874 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 26 00:07:16 crc kubenswrapper[4874]: I0126 00:07:16.550499 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:01:48.347535306 +0000 UTC Jan 26 00:07:17 crc kubenswrapper[4874]: I0126 00:07:17.119405 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 26 00:07:17 crc kubenswrapper[4874]: I0126 00:07:17.119854 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 26 00:07:17 crc kubenswrapper[4874]: E0126 00:07:17.508009 4874 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the 
control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 26 00:07:17 crc kubenswrapper[4874]: I0126 00:07:17.541114 4874 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 26 00:07:17 crc kubenswrapper[4874]: I0126 00:07:17.551430 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:51:42.478867601 +0000 UTC Jan 26 00:07:18 crc kubenswrapper[4874]: W0126 00:07:18.090811 4874 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.090947 4874 trace.go:236] Trace[1195699436]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Jan-2026 00:07:08.088) (total time: 10002ms): Jan 26 00:07:18 crc kubenswrapper[4874]: Trace[1195699436]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:18.090) Jan 26 00:07:18 crc kubenswrapper[4874]: Trace[1195699436]: [10.002016104s] [10.002016104s] END Jan 26 00:07:18 crc kubenswrapper[4874]: E0126 00:07:18.090993 4874 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.374663 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.375093 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.380947 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.381276 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 26 
00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.551820 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 22:49:07.81686435 +0000 UTC Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.836534 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.836705 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.838033 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.838325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:18 crc kubenswrapper[4874]: I0126 00:07:18.838337 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:19 crc kubenswrapper[4874]: I0126 00:07:19.552331 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:55:37.670069119 +0000 UTC Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.446198 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.446441 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.447893 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.448009 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.448029 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.452242 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.553477 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:42:55.694881956 +0000 UTC Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.731099 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.732373 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.732434 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:20 crc kubenswrapper[4874]: I0126 00:07:20.732475 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:21 crc kubenswrapper[4874]: I0126 00:07:21.553812 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2026-01-17 15:36:46.779031656 +0000 UTC Jan 26 00:07:21 crc kubenswrapper[4874]: I0126 00:07:21.700855 4874 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 26 00:07:21 crc kubenswrapper[4874]: I0126 00:07:21.746395 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 26 00:07:21 crc kubenswrapper[4874]: I0126 00:07:21.766250 4874 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.508549 4874 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.508673 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.540374 4874 apiserver.go:52] "Watching apiserver" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.544625 4874 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.545092 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.545545 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.545742 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.545840 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:22 crc kubenswrapper[4874]: E0126 00:07:22.545924 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.546359 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:22 crc kubenswrapper[4874]: E0126 00:07:22.546631 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.546656 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.546816 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:22 crc kubenswrapper[4874]: E0126 00:07:22.547159 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.549090 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.549340 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.549093 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.549154 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.549694 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.550410 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.550514 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.550565 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.551028 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.553918 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:50:57.67075935 +0000 UTC Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.610234 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.629080 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.646694 4874 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.648380 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.658369 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.670583 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.682354 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.691811 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.700850 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.719676 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.733106 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.746121 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.753717 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.755366 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.763678 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.782455 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.796580 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.814368 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.830469 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.848063 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.865286 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.879764 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.892924 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.906238 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.932556 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03
ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:22 crc kubenswrapper[4874]: I0126 00:07:22.948921 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.384045 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.386410 4874 trace.go:236] Trace[436298933]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Jan-2026 00:07:09.359) (total time: 14026ms): Jan 26 00:07:23 crc kubenswrapper[4874]: Trace[436298933]: ---"Objects listed" error: 14026ms (00:07:23.386) Jan 26 00:07:23 crc kubenswrapper[4874]: Trace[436298933]: [14.026822373s] [14.026822373s] END Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.386449 4874 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.386795 4874 trace.go:236] Trace[1842764735]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Jan-2026 00:07:09.220) (total time: 14166ms): Jan 26 00:07:23 crc kubenswrapper[4874]: Trace[1842764735]: ---"Objects listed" error: 14166ms (00:07:23.386) Jan 26 00:07:23 crc kubenswrapper[4874]: Trace[1842764735]: [14.16644429s] [14.16644429s] END Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.386832 4874 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.389208 4874 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.389922 4874 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.390848 4874 trace.go:236] Trace[1343039423]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Jan-2026 00:07:08.946) (total time: 14444ms): Jan 26 00:07:23 crc kubenswrapper[4874]: Trace[1343039423]: ---"Objects listed" error: 14444ms (00:07:23.390) Jan 26 
00:07:23 crc kubenswrapper[4874]: Trace[1343039423]: [14.444151267s] [14.444151267s] END Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.390868 4874 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490284 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490357 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490396 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490431 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490464 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490494 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490525 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490564 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490598 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 26 00:07:23 crc 
kubenswrapper[4874]: I0126 00:07:23.490626 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490732 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490778 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490816 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490861 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490899 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.490941 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491037 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491354 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491485 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491570 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491579 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491628 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491724 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491697 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491853 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.491888 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492345 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492449 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492592 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492454 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492468 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492593 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492652 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492700 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492759 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492780 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492778 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492888 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.492950 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493002 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493042 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493110 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493156 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493203 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493248 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493290 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493332 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493339 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493388 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493435 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493481 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493517 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493548 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493581 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493673 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493710 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493743 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" 
(UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493776 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493808 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493840 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493872 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493901 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493931 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493990 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494036 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494069 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494099 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 26 00:07:23 crc 
kubenswrapper[4874]: I0126 00:07:23.494131 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494166 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494198 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494262 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494294 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494331 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494367 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494401 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494436 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494470 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 
00:07:23.493329 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494491 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494503 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494542 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494574 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494618 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494772 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494813 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494850 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494883 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494915 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494948 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495497 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495571 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495637 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495686 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495720 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495765 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495811 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495856 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495908 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495951 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496038 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496085 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496130 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496177 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496228 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493379 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.493395 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494120 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494323 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494400 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494705 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496276 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496329 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496384 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496432 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496483 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496532 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496584 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496630 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496676 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496727 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496775 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496822 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496864 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496909 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497007 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497058 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497108 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497157 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497200 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497239 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497273 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497309 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497343 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497376 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497408 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497447 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497481 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497517 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497565 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497599 4874 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497634 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497675 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497709 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497742 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497775 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497817 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497852 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497884 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497918 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.497955 4874 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498021 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498056 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498092 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498126 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498161 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498194 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498257 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498291 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498325 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 26 
00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498359 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498393 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498432 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498466 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498503 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498540 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498573 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498610 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498644 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498678 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498716 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498751 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498786 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498821 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498858 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498891 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498924 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498989 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499035 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499079 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499120 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499163 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499202 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499238 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499276 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499316 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499355 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499393 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499429 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499467 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499503 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499539 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499575 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499628 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499687 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499726 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499763 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499812 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499851 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499892 4874 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499934 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500002 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500045 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500087 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500128 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500166 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500208 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500247 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500283 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 
00:07:23.500325 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500369 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500406 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500441 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500481 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500521 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500559 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500597 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500637 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500679 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 
00:07:23.500723 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500769 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500841 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500894 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500939 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501030 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501073 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501121 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501162 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") 
" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501234 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501281 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501324 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501367 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501409 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501452 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501492 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501581 4874 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501608 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") 
on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501631 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501656 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501680 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501703 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501726 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501748 4874 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501770 4874 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501791 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501813 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501834 4874 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501855 4874 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501879 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501902 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501923 4874 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501946 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502009 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502031 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494748 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494939 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.494946 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495014 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495086 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.495698 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496249 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496267 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496493 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496600 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496559 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496812 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.496911 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.498767 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499582 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499751 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.499948 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500036 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500248 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500131 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.500623 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.501732 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502258 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502407 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502408 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502477 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502555 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.502610 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:07:24.002586947 +0000 UTC m=+18.736693484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502626 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.502803 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503084 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503081 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503117 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503270 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503346 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503523 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503576 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.503926 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.504103 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.504236 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.504409 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.504539 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.504622 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.505548 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.505563 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.505620 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.505705 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.506118 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.506279 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.506402 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.506452 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.506781 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507273 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507026 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507611 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507767 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507671 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.507931 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508033 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508230 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508259 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508318 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508602 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508844 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508938 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509167 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509138 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509240 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509309 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509508 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509564 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509710 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509707 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.509951 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510108 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510335 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510588 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510497 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510767 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.511008 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.511170 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.510213 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.511556 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.511580 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.511878 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512143 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512330 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512337 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512368 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512548 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512727 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512769 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512779 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512842 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512881 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.512897 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513013 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513088 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44078->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514496 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44078->192.168.126.11:17697: read: connection reset by peer" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514508 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514695 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513217 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513644 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513866 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513889 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514176 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514733 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.515305 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.515379 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.515418 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.515632 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.514673 4874 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.515779 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.513211 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.516010 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.516308 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.516650 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.516921 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.517136 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.517347 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.517544 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.517557 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.517766 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.518976 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.519372 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.519669 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.519675 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.518542 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.518608 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.518636 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.519913 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.519978 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.520083 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.520284 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.521436 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.518507 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.518412 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.521800 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:24.021774247 +0000 UTC m=+18.755880794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522155 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522336 4874 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522399 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.522612 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522636 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.522669 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:24.02265328 +0000 UTC m=+18.756759827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522705 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.522990 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.523050 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.523224 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.523425 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.523539 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.527021 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.529493 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.529567 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.523718 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.529673 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.529703 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.529941 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.536850 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.508260 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.538201 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.538204 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.538299 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.538315 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.538498 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.538677 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.538236 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.539382 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:24.039363613 +0000 UTC m=+18.773470150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.539278 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.542283 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.542744 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.542999 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.543527 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.543835 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.543824 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544046 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544144 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544367 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544405 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544444 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544562 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.544808 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.545325 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.545490 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.545939 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.549766 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.547818 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.550413 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:23 crc kubenswrapper[4874]: E0126 00:07:23.550653 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:24.050630807 +0000 UTC m=+18.784737354 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.549131 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.549718 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.546076 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.553070 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.555709 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:02:00.682837867 +0000 UTC Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.561453 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.573243 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.584214 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.599049 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.603614 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.603830 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.603949 4874 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604039 4874 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604105 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.603825 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604161 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604247 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc 
kubenswrapper[4874]: I0126 00:07:23.604259 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604269 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604278 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604288 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604297 4874 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604316 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604325 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604333 4874 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604342 4874 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604351 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604361 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604371 4874 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604380 4874 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" 
Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604388 4874 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604397 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604406 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604415 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604423 4874 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604431 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604440 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604452 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604461 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604469 4874 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604478 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604487 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604496 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604504 4874 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604513 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604522 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604533 4874 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604543 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604552 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604561 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604569 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604578 4874 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604586 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604595 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604604 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604613 4874 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: 
I0126 00:07:23.604622 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604630 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604639 4874 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604648 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604658 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604668 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604676 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604686 4874 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604694 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604703 4874 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604711 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604719 4874 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604727 4874 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc 
kubenswrapper[4874]: I0126 00:07:23.604736 4874 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604745 4874 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604762 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604771 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604778 4874 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604788 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604796 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604804 4874 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604811 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604819 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604827 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604835 4874 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604843 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 
26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604852 4874 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604860 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604869 4874 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604879 4874 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604887 4874 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604895 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604904 4874 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604913 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604920 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604930 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604939 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604948 4874 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604975 4874 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath 
\"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604984 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.604994 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605009 4874 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605018 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605026 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605034 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605043 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605052 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605061 4874 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605070 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605079 4874 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605087 4874 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605095 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.603950 
4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605103 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605273 4874 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605296 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605311 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605349 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605365 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605378 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605422 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605439 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605451 4874 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605465 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605503 4874 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605518 4874 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605532 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605546 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605584 4874 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605600 4874 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605612 4874 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605627 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605666 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605683 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605697 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605710 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605723 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605804 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605842 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605854 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605866 4874 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605916 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605928 4874 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605938 4874 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605948 4874 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605986 4874 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.605997 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606008 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606019 4874 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606030 4874 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606040 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" 
Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606052 4874 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606063 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606073 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606084 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606094 4874 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606105 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606115 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606127 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606165 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606177 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606189 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606201 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606213 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: 
I0126 00:07:23.606225 4874 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606239 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606252 4874 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606265 4874 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606276 4874 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606288 4874 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606298 4874 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606311 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606322 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606333 4874 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606346 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606358 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606370 4874 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 
00:07:23.606389 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606402 4874 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606414 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606427 4874 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606439 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606451 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.606463 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607174 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607194 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607207 4874 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607246 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607258 4874 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607271 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.607283 4874 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.613823 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.627242 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.628135 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.629213 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.629782 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.630927 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.631604 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.632424 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.633413 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.634048 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.635089 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.635655 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.636692 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.637207 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.637692 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.638561 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.639060 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.640002 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.640405 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.640942 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.641948 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.642431 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.643340 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.643831 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.644923 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.645358 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.646194 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.647371 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.647876 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.649108 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.649727 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.650686 4874 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.650784 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.652647 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.653516 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.653918 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.655537 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.656159 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.657040 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.657639 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.658641 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.659085 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.660031 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.660656 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.661611 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.662204 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.663072 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.663587 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.664653 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.665116 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.666042 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.666512 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.667492 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.668154 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.668584 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.707644 4874 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.740916 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.743892 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f" exitCode=255 Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.743989 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f"} Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.753624 4874 scope.go:117] "RemoveContainer" containerID="7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.753775 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.754321 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.766468 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.769502 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.780576 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.784195 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 26 00:07:23 crc kubenswrapper[4874]: W0126 00:07:23.787109 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a1869de98414269fd0ab405ac5ffefe0eafcd80ed5de4b758ab003a9aaa5e92b WatchSource:0}: Error finding container a1869de98414269fd0ab405ac5ffefe0eafcd80ed5de4b758ab003a9aaa5e92b: Status 404 returned error can't find the container with id a1869de98414269fd0ab405ac5ffefe0eafcd80ed5de4b758ab003a9aaa5e92b Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.793604 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: W0126 00:07:23.797296 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0ef2c1361429877a89d5367e1314fe87a541f9481bfc3f0ee2d5ba917f635193 WatchSource:0}: Error finding container 0ef2c1361429877a89d5367e1314fe87a541f9481bfc3f0ee2d5ba917f635193: Status 404 returned error can't find the container with id 0ef2c1361429877a89d5367e1314fe87a541f9481bfc3f0ee2d5ba917f635193 Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.804025 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.813464 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: W0126 00:07:23.820505 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-286fde95bdcfc356fee8d24f1320fa0405e619ec06fbc5b044e3de24c5a18e6d WatchSource:0}: Error finding container 286fde95bdcfc356fee8d24f1320fa0405e619ec06fbc5b044e3de24c5a18e6d: Status 404 returned error can't find the container with id 286fde95bdcfc356fee8d24f1320fa0405e619ec06fbc5b044e3de24c5a18e6d Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.825055 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:23 crc kubenswrapper[4874]: I0126 00:07:23.836297 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.014083 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.014291 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:07:25.014265993 +0000 UTC m=+19.748372530 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.114643 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.114704 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.114734 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.114756 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.114878 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.114876 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.114951 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115020 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:25.114997979 +0000 UTC m=+19.849104516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.114898 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115044 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:25.11503426 +0000 UTC m=+19.849140887 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115074 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115074 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115099 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115111 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115166 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:25.115144023 +0000 UTC m=+19.849250610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.115205 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:25.115177014 +0000 UTC m=+19.849283591 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.556926 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:27:40.978029541 +0000 UTC Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.619713 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.619798 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.619915 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.619943 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.620163 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:24 crc kubenswrapper[4874]: E0126 00:07:24.620285 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.748903 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.748977 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.748991 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0ef2c1361429877a89d5367e1314fe87a541f9481bfc3f0ee2d5ba917f635193"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.751267 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.751298 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a1869de98414269fd0ab405ac5ffefe0eafcd80ed5de4b758ab003a9aaa5e92b"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.754033 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.756118 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.756637 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.757844 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"286fde95bdcfc356fee8d24f1320fa0405e619ec06fbc5b044e3de24c5a18e6d"} Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.772241 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.786727 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.803089 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.814126 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.825455 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.840644 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.859076 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.893196 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.916452 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.954087 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.969101 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:24 crc kubenswrapper[4874]: I0126 00:07:24.991529 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.019597 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.024145 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.024329 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-26 00:07:27.024294704 +0000 UTC m=+21.758401251 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.034827 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.049087 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.067351 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.124711 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.124762 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.124787 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.124814 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.124923 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125005 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:27.124967619 +0000 UTC m=+21.859091216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125065 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125121 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125140 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125224 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:27.125195465 +0000 UTC m=+21.859302052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125290 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125327 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-26 00:07:27.125314908 +0000 UTC m=+21.859421565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125451 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125480 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125499 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:25 crc kubenswrapper[4874]: E0126 00:07:25.125549 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:27.125535924 +0000 UTC m=+21.859642531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.557375 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:23:25.302709078 +0000 UTC Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.650303 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.681421 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.717390 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.742868 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.754822 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.765849 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.775445 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:25 crc kubenswrapper[4874]: I0126 00:07:25.785750 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.558413 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:08:31.622015571 +0000 UTC Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.590047 4874 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.591426 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.591460 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.591468 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.591524 4874 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.599547 4874 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.599890 4874 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.601403 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.601451 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.601467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.601502 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.601521 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.620017 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.620057 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.620234 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.620440 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.620560 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.620679 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.626893 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.631141 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.631188 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.631235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.631257 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.631273 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.648570 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.653694 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.653742 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.653756 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.653777 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.653791 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.670198 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.674710 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.674745 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.674755 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.674774 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.674787 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.689524 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.693535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.693663 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.693731 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.693796 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.693853 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.705392 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:26 crc kubenswrapper[4874]: E0126 00:07:26.705695 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.707280 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.707324 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.707340 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.707364 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.707381 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.810841 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.810898 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.810915 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.810939 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.810988 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.913615 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.913653 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.913662 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.913677 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:26 crc kubenswrapper[4874]: I0126 00:07:26.913688 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:26Z","lastTransitionTime":"2026-01-26T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.016735 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.016792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.016811 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.016840 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.016862 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.043400 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.043555 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.043527753 +0000 UTC m=+25.777634290 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.119667 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.119717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.119728 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.119747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.119760 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.144484 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.144552 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.144594 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.144633 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144774 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144801 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144815 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144874 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.144854875 +0000 UTC m=+25.878961412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144866 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144775 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144932 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144949 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.145038 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.144998069 +0000 UTC m=+25.879104656 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.145078 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.145059771 +0000 UTC m=+25.879166448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.144654 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: E0126 00:07:27.145282 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.145268616 +0000 UTC m=+25.879375143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.222745 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.222804 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.222821 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.222850 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.222867 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.326088 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.326162 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.326182 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.326213 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.326238 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.429203 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.429282 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.429306 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.429339 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.429363 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.532265 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.532339 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.532362 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.532391 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.532412 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.559183 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:42:34.040202075 +0000 UTC Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.636437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.636490 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.636511 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.636538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.636558 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.739567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.739645 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.739671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.739701 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.739726 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.768509 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.786260 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.808208 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.843549 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.843632 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.843651 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.843689 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.843777 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.846574 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.869949 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.891390 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.907953 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.920819 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.936747 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.946858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.946912 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.946929 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.946954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:27 crc kubenswrapper[4874]: I0126 00:07:27.946997 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:27Z","lastTransitionTime":"2026-01-26T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.050744 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.050820 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.050845 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.050875 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.050898 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.154725 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.154810 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.154834 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.154865 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.154893 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.258027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.258062 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.258072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.258088 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.258100 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.361287 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.361326 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.361339 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.361357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.361369 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.464281 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.464342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.464360 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.464387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.464405 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.559466 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 22:20:12.42571524 +0000 UTC Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.567318 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.567361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.567378 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.567406 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.567428 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.620285 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.620334 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.620285 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:28 crc kubenswrapper[4874]: E0126 00:07:28.620467 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:28 crc kubenswrapper[4874]: E0126 00:07:28.620667 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:28 crc kubenswrapper[4874]: E0126 00:07:28.620878 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.670222 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.670275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.670292 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.670317 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.670335 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.771838 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.771891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.771907 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.771929 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.771945 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.874540 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.874585 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.874598 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.874616 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.874625 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.889161 4874 csr.go:261] certificate signing request csr-2gx4t is approved, waiting to be issued Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.894420 4874 csr.go:257] certificate signing request csr-2gx4t is issued Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.976757 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.976811 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.976819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.976835 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:28 crc kubenswrapper[4874]: I0126 00:07:28.976843 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:28Z","lastTransitionTime":"2026-01-26T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.080789 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.080824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.080833 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.080848 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.080858 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.183120 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.183162 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.183172 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.183190 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.183200 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.285393 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.285447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.285458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.285475 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.285487 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.387815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.387847 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.387858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.387874 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.387883 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.490431 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.490472 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.490481 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.490499 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.490508 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.510719 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.516523 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.528083 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.553788 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.559851 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 03:09:29.660616728 +0000 UTC Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.573082 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.578133 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.592111 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.593242 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.593302 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.593318 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.593342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.593356 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.644698 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.664514 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.683165 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.695441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.695485 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.695494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.695509 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.695522 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.699423 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.715755 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.728629 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.744346 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.763056 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ffp42"] Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.763401 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.764069 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dvmzv"] Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.764722 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qjn7c"] Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.764865 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.764939 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.764996 4874 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.765198 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-m5wsd"] Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.765459 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-q4qrv"] Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.772119 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.772601 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773066 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773081 4874 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773103 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773112 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773206 4874 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773225 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773233 4874 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773269 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between 
node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773278 4874 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773293 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773298 4874 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773315 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773342 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773360 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773408 4874 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773422 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc 
kubenswrapper[4874]: W0126 00:07:29.773454 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773475 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773493 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773475 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773557 4874 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773572 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.773588 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773695 4874 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773720 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.773668 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.773899 4874 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.773918 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777209 4874 reflector.go:561] 
object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777250 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777270 4874 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777279 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777301 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777321 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777332 4874 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777363 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777407 4874 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps 
"default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777431 4874 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777448 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777427 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777512 4874 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777558 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: W0126 00:07:29.777623 4874 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 26 00:07:29 crc kubenswrapper[4874]: E0126 00:07:29.777648 4874 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.799202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.799241 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.799250 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.799266 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.799278 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.801742 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.814243 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.824946 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.835209 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.844271 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.854409 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.864462 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.874495 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.875789 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-kubelet\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.875830 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.875862 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6vr\" (UniqueName: \"kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.875940 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-etc-kubernetes\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.875994 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876023 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 
00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876095 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876690 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876741 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876780 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876806 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876835 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876858 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876887 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876932 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-multus\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.876988 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpcv\" (UniqueName: \"kubernetes.io/projected/23d0d511-4c29-434d-92b1-74ca6909e53f-kube-api-access-xqpcv\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877008 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-k8s-cni-cncf-io\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877026 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-conf-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877043 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30567d80-b29a-4ecb-b39a-d97b81664dee-hosts-file\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877073 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877089 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877123 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877144 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877177 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-cnibin\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877203 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-os-release\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877224 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877250 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-system-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877273 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-bin\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877297 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-system-cni-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877317 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7tn\" (UniqueName: \"kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877336 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-netns\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877353 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877389 
4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1fd285d-2d8d-4a57-8afa-450025153e32-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877407 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877422 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5zz\" (UniqueName: \"kubernetes.io/projected/f3d49871-449f-49f4-a40a-121286b1e929-kube-api-access-xg5zz\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877440 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-socket-dir-parent\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877455 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877471 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877510 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877535 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877556 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-multus-certs\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") 
" pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877595 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877615 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877629 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877651 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b1fd285d-2d8d-4a57-8afa-450025153e32-rootfs\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877689 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-os-release\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877714 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-hostroot\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877733 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877761 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.877799 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-cnibin\") pod \"multus-qjn7c\" (UID: 
\"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.887350 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugin
s\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2
bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.895278 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-26 00:02:28 +0000 UTC, rotation deadline is 2026-12-02 15:22:01.638230574 +0000 UTC Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.895317 4874 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7455h14m31.742917762s for next certificate rotation Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.900505 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.901156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.901205 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.901216 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.901232 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.901245 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:29Z","lastTransitionTime":"2026-01-26T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.924040 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.943217 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.957677 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.971949 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979216 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-multus\") pod 
\"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979251 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979272 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979287 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979309 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpcv\" (UniqueName: \"kubernetes.io/projected/23d0d511-4c29-434d-92b1-74ca6909e53f-kube-api-access-xqpcv\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979324 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-k8s-cni-cncf-io\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979338 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-conf-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979353 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30567d80-b29a-4ecb-b39a-d97b81664dee-hosts-file\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979377 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979394 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 
00:07:29.979412 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979427 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979461 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-cnibin\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979478 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-os-release\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979494 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979512 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-system-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979526 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-bin\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979542 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-system-cni-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979558 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7tn\" (UniqueName: \"kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979579 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1fd285d-2d8d-4a57-8afa-450025153e32-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979594 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979610 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-netns\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979624 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979639 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-socket-dir-parent\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979655 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979669 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979685 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5zz\" (UniqueName: \"kubernetes.io/projected/f3d49871-449f-49f4-a40a-121286b1e929-kube-api-access-xg5zz\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979703 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979719 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979734 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b1fd285d-2d8d-4a57-8afa-450025153e32-rootfs\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979749 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-multus-certs\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979764 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979780 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979794 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979814 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-cnibin\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979828 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-os-release\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979841 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-hostroot\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979855 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn\") pod \"ovnkube-node-dvmzv\" (UID: 
\"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979870 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979885 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-kubelet\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979905 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979922 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6vr\" (UniqueName: \"kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979937 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-etc-kubernetes\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.979987 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980010 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980025 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980040 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config\") pod \"ovnkube-node-dvmzv\" (UID: 
\"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980054 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980069 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980096 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980176 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-etc-kubernetes\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980196 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-k8s-cni-cncf-io\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980214 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980232 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-multus\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980220 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-os-release\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980257 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-hostroot\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 
00:07:29.980267 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-conf-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980326 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980396 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980408 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980420 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980453 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-system-cni-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980459 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980480 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980464 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-system-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980481 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980498 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980516 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-netns\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980526 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b1fd285d-2d8d-4a57-8afa-450025153e32-rootfs\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980575 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-run-multus-certs\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980595 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-os-release\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980631 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980632 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-kubelet\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980647 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980657 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980672 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-cnibin\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980675 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30567d80-b29a-4ecb-b39a-d97b81664dee-hosts-file\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980719 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-cni-dir\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980724 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-host-var-lib-cni-bin\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980745 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980748 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980760 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980780 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-cnibin\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980135 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-socket-dir-parent\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.980835 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f3d49871-449f-49f4-a40a-121286b1e929-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:29 crc kubenswrapper[4874]: I0126 00:07:29.990747 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-ku
be-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.002185 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.003909 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.003929 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.003938 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.003982 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.003998 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.011798 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:30 crc kubenswrapper[4874]: 
I0126 00:07:30.027685 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":
false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.038309 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.106047 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.106080 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.106088 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.106101 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.106111 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.208598 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.208665 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.208683 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.208709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.208726 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.311050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.311090 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.311102 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.311117 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.311128 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.413529 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.413566 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.413576 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.413591 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.413601 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.523380 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.523423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.523437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.523454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.523667 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.560228 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 15:34:06.876963899 +0000 UTC Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.619352 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.619406 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.619352 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.619595 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.619461 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.619759 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.620146 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.626639 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.626900 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.627039 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.627160 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.627295 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.635811 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.641607 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1fd285d-2d8d-4a57-8afa-450025153e32-mcd-auth-proxy-config\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.649376 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.652712 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.693854 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.709076 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpcv\" (UniqueName: \"kubernetes.io/projected/23d0d511-4c29-434d-92b1-74ca6909e53f-kube-api-access-xqpcv\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.709480 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5zz\" (UniqueName: \"kubernetes.io/projected/f3d49871-449f-49f4-a40a-121286b1e929-kube-api-access-xg5zz\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " 
pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.730195 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.730243 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.730256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.730277 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.730291 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.749199 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.751540 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.803437 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.816133 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.833094 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.833207 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.833275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.833336 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.833396 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.898624 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.901529 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.935719 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.936040 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.936156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.936263 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:30 crc kubenswrapper[4874]: I0126 00:07:30.936370 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:30Z","lastTransitionTime":"2026-01-26T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.980905 4874 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.980975 4874 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981013 4874 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981023 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config podName:23d0d511-4c29-434d-92b1-74ca6909e53f nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.480999269 +0000 UTC m=+26.215105796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config") pod "multus-qjn7c" (UID: "23d0d511-4c29-434d-92b1-74ca6909e53f") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981110 4874 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981143 4874 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981148 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls podName:b1fd285d-2d8d-4a57-8afa-450025153e32 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.481099032 +0000 UTC m=+26.215205669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls") pod "machine-config-daemon-ffp42" (UID: "b1fd285d-2d8d-4a57-8afa-450025153e32") : failed to sync secret cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981122 4874 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981224 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy podName:f3d49871-449f-49f4-a40a-121286b1e929 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.481208035 +0000 UTC m=+26.215314682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy") pod "multus-additional-cni-plugins-q4qrv" (UID: "f3d49871-449f-49f4-a40a-121286b1e929") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981261 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy podName:23d0d511-4c29-434d-92b1-74ca6909e53f nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.481244095 +0000 UTC m=+26.215350772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy") pod "multus-qjn7c" (UID: "23d0d511-4c29-434d-92b1-74ca6909e53f") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981286 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert podName:79a8f534-7005-4a14-ae01-0a689a541502 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.481273876 +0000 UTC m=+26.215380563 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert") pod "ovnkube-node-dvmzv" (UID: "79a8f534-7005-4a14-ae01-0a689a541502") : failed to sync secret cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.981316 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist podName:f3d49871-449f-49f4-a40a-121286b1e929 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.481303197 +0000 UTC m=+26.215409864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-q4qrv" (UID: "f3d49871-449f-49f4-a40a-121286b1e929") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.992057 4874 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.992113 4874 projected.go:194] Error preparing data for projected volume kube-api-access-zq7tn for pod openshift-dns/node-resolver-m5wsd: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.992193 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn podName:30567d80-b29a-4ecb-b39a-d97b81664dee nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.492166481 +0000 UTC m=+26.226273098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zq7tn" (UniqueName: "kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn") pod "node-resolver-m5wsd" (UID: "30567d80-b29a-4ecb-b39a-d97b81664dee") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:30 crc kubenswrapper[4874]: E0126 00:07:30.994716 4874 projected.go:288] Couldn't get configMap openshift-machine-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.000595 4874 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.001726 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.039127 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.039171 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.039179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.039195 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.039205 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.050803 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.090340 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.090580 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:07:39.090548463 +0000 UTC m=+33.824655000 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.096763 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.097385 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.134852 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.141637 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.141680 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.141690 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.141709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.141722 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.171427 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.184538 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.192113 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.192168 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192259 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192316 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:39.192299367 +0000 UTC m=+33.926405904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192325 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.192262 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192365 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:39.192355378 +0000 UTC m=+33.926461915 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.192409 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192488 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192503 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192503 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192515 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192578 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:39.192565864 +0000 UTC m=+33.926672411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192519 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192597 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.192626 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-26 00:07:39.192618505 +0000 UTC m=+33.926725032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.219990 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.227234 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.229789 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.230693 4874 projected.go:194] Error preparing data for projected volume kube-api-access-52hgl for pod openshift-ovn-kubernetes/ovnkube-node-dvmzv: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.230790 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl podName:79a8f534-7005-4a14-ae01-0a689a541502 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.730749297 +0000 UTC m=+26.464855834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-52hgl" (UniqueName: "kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl") pod "ovnkube-node-dvmzv" (UID: "79a8f534-7005-4a14-ae01-0a689a541502") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.244628 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.244671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.244684 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.244703 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.244721 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.249979 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.255427 4874 projected.go:194] Error preparing data for projected volume kube-api-access-bh6vr for pod openshift-machine-config-operator/machine-config-daemon-ffp42: failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: E0126 00:07:31.255497 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr podName:b1fd285d-2d8d-4a57-8afa-450025153e32 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:31.755477616 +0000 UTC m=+26.489584153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bh6vr" (UniqueName: "kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr") pod "machine-config-daemon-ffp42" (UID: "b1fd285d-2d8d-4a57-8afa-450025153e32") : failed to sync configmap cache: timed out waiting for the condition Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.287344 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.303370 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.347261 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.347308 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.347319 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.347338 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.347351 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.348043 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.450311 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.450358 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.450372 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.450389 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.450400 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.494830 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.494885 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.494909 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7tn\" (UniqueName: \"kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.494989 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.495007 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.495023 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.495041 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.495729 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-multus-daemon-config\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.496128 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23d0d511-4c29-434d-92b1-74ca6909e53f-cni-binary-copy\") pod \"multus-qjn7c\" (UID: \"23d0d511-4c29-434d-92b1-74ca6909e53f\") " pod="openshift-multus/multus-qjn7c" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.496135 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.496466 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f3d49871-449f-49f4-a40a-121286b1e929-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4qrv\" (UID: \"f3d49871-449f-49f4-a40a-121286b1e929\") " pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.499002 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7tn\" (UniqueName: \"kubernetes.io/projected/30567d80-b29a-4ecb-b39a-d97b81664dee-kube-api-access-zq7tn\") pod \"node-resolver-m5wsd\" (UID: \"30567d80-b29a-4ecb-b39a-d97b81664dee\") " pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.501837 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.502788 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1fd285d-2d8d-4a57-8afa-450025153e32-proxy-tls\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.553160 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc 
kubenswrapper[4874]: I0126 00:07:31.553431 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.553513 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.553662 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.553758 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.560409 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 09:46:14.95166752 +0000 UTC Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.600519 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qjn7c" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.607805 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m5wsd" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.613558 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" Jan 26 00:07:31 crc kubenswrapper[4874]: W0126 00:07:31.623330 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d0d511_4c29_434d_92b1_74ca6909e53f.slice/crio-ad6bd6b0110f456d3b9e3b020ea8f5a9e26b49fad4fbef11bc8bd0a0a7b96bc0 WatchSource:0}: Error finding container ad6bd6b0110f456d3b9e3b020ea8f5a9e26b49fad4fbef11bc8bd0a0a7b96bc0: Status 404 returned error can't find the container with id ad6bd6b0110f456d3b9e3b020ea8f5a9e26b49fad4fbef11bc8bd0a0a7b96bc0 Jan 26 00:07:31 crc kubenswrapper[4874]: W0126 00:07:31.633245 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30567d80_b29a_4ecb_b39a_d97b81664dee.slice/crio-ecc9c4125e0197caa18182d595609a3f32ba3b798701eca5f9b08b09de50f798 WatchSource:0}: Error finding container ecc9c4125e0197caa18182d595609a3f32ba3b798701eca5f9b08b09de50f798: Status 404 returned error can't find the container with id ecc9c4125e0197caa18182d595609a3f32ba3b798701eca5f9b08b09de50f798 Jan 26 00:07:31 crc kubenswrapper[4874]: W0126 00:07:31.634909 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d49871_449f_49f4_a40a_121286b1e929.slice/crio-fe992b40ebebed9d80a19142c706dd43f45090da0fecf1ed8e61cfa233624959 WatchSource:0}: Error finding container fe992b40ebebed9d80a19142c706dd43f45090da0fecf1ed8e61cfa233624959: Status 404 returned error can't find the container with id fe992b40ebebed9d80a19142c706dd43f45090da0fecf1ed8e61cfa233624959 Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.659979 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.660019 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.660033 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.660057 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.660069 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.763275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.763314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.763325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.763342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.763353 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.781289 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerStarted","Data":"fe992b40ebebed9d80a19142c706dd43f45090da0fecf1ed8e61cfa233624959"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.782634 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m5wsd" event={"ID":"30567d80-b29a-4ecb-b39a-d97b81664dee","Type":"ContainerStarted","Data":"ecc9c4125e0197caa18182d595609a3f32ba3b798701eca5f9b08b09de50f798"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.785242 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerStarted","Data":"78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.785320 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerStarted","Data":"ad6bd6b0110f456d3b9e3b020ea8f5a9e26b49fad4fbef11bc8bd0a0a7b96bc0"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.798406 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.798456 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6vr\" (UniqueName: \"kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.802460 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") pod \"ovnkube-node-dvmzv\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.802634 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6vr\" (UniqueName: \"kubernetes.io/projected/b1fd285d-2d8d-4a57-8afa-450025153e32-kube-api-access-bh6vr\") pod \"machine-config-daemon-ffp42\" (UID: \"b1fd285d-2d8d-4a57-8afa-450025153e32\") " pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.803587 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.816053 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.826356 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.837647 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.857704 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.865877 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.865905 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.865914 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.865928 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.865937 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.871543 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.883541 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.885398 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 
2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.894348 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:31 crc kubenswrapper[4874]: W0126 00:07:31.898170 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fd285d_2d8d_4a57_8afa_450025153e32.slice/crio-c31153c29515cab5796c3a3b35ed85336764ee78a17d0b3778ce80993913ad34 WatchSource:0}: Error finding container c31153c29515cab5796c3a3b35ed85336764ee78a17d0b3778ce80993913ad34: Status 404 returned error can't find the container with id c31153c29515cab5796c3a3b35ed85336764ee78a17d0b3778ce80993913ad34 Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.904248 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.918017 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.932141 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.945084 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.961318 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.968161 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.968201 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.968217 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.968238 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.968253 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:31Z","lastTransitionTime":"2026-01-26T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.977122 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:31 crc kubenswrapper[4874]: I0126 00:07:31.990493 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"n
ame\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.070826 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.070868 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.070879 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.070897 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.070908 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.173403 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.173433 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.173441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.173454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.173465 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.275752 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.275782 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.275790 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.275803 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.275814 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.378376 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.378440 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.378462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.378493 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.378513 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.481908 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.482200 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.482208 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.482222 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.482230 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.560842 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 08:17:56.383070042 +0000 UTC Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.584628 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.584869 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.584944 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.585086 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.585182 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.619764 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:32 crc kubenswrapper[4874]: E0126 00:07:32.619987 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.620504 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:32 crc kubenswrapper[4874]: E0126 00:07:32.620614 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.620696 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:32 crc kubenswrapper[4874]: E0126 00:07:32.620786 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.688404 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.688466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.688492 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.688526 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.688551 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.772324 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mmtqg"] Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.773157 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.775268 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.777068 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.777248 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.777734 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.791345 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.791580 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.791659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.791733 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.791813 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.792794 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.793699 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" exitCode=0 Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.793820 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.793869 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"553c8860dba1935ef1d302365108e2e24a0bac9817de49dd9df4618177b49703"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.798511 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.798587 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.798619 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"c31153c29515cab5796c3a3b35ed85336764ee78a17d0b3778ce80993913ad34"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.801402 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198" exitCode=0 Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.801475 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.803498 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-m5wsd" event={"ID":"30567d80-b29a-4ecb-b39a-d97b81664dee","Type":"ContainerStarted","Data":"0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.810795 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.835077 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\
\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.847975 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.861691 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.883357 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895337 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895367 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895471 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895485 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.895494 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.909830 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d2fb749-294e-42b2-952e-c0f4b1611c06-serviceca\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.910070 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cz8v\" (UniqueName: \"kubernetes.io/projected/2d2fb749-294e-42b2-952e-c0f4b1611c06-kube-api-access-2cz8v\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.910127 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2fb749-294e-42b2-952e-c0f4b1611c06-host\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.910764 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.924516 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.939584 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.952473 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.966501 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.976580 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.993437 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.997923 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.998159 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.998275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.998406 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:32 crc kubenswrapper[4874]: I0126 00:07:32.998514 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:32Z","lastTransitionTime":"2026-01-26T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.004816 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.011561 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d2fb749-294e-42b2-952e-c0f4b1611c06-serviceca\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.011648 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cz8v\" (UniqueName: \"kubernetes.io/projected/2d2fb749-294e-42b2-952e-c0f4b1611c06-kube-api-access-2cz8v\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.011686 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2fb749-294e-42b2-952e-c0f4b1611c06-host\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.011769 4874 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2fb749-294e-42b2-952e-c0f4b1611c06-host\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.013466 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d2fb749-294e-42b2-952e-c0f4b1611c06-serviceca\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.018331 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"mul
tus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.034490 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cz8v\" (UniqueName: \"kubernetes.io/projected/2d2fb749-294e-42b2-952e-c0f4b1611c06-kube-api-access-2cz8v\") pod \"node-ca-mmtqg\" (UID: \"2d2fb749-294e-42b2-952e-c0f4b1611c06\") " pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.038219 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2
a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.055396 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.067168 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.083689 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.087984 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mmtqg" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.095479 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.101385 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.101422 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.101433 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.101453 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.101465 4874 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: W0126 00:07:33.103678 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d2fb749_294e_42b2_952e_c0f4b1611c06.slice/crio-8f650589cab315d92dc7adc9eb5dccb9bceeb3b5d5fda3848ea48662cc0610de WatchSource:0}: Error finding container 8f650589cab315d92dc7adc9eb5dccb9bceeb3b5d5fda3848ea48662cc0610de: Status 404 returned error can't find the container with id 8f650589cab315d92dc7adc9eb5dccb9bceeb3b5d5fda3848ea48662cc0610de Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.111508 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.135430 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.168515 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.203486 4874 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.203528 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.203541 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.203559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.203571 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.217456 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z 
is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.229392 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.250311 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.270137 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.281066 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.297391 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc 
kubenswrapper[4874]: I0126 00:07:33.306135 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.306179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.306193 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.306212 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.306227 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.408265 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.408305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.408315 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.408333 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.408345 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.512158 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.512228 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.512244 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.512270 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.512288 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.561711 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 00:08:50.055705356 +0000 UTC Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.615253 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.615326 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.615344 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.615378 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.615397 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.718398 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.718843 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.718859 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.718898 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.718915 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.809253 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mmtqg" event={"ID":"2d2fb749-294e-42b2-952e-c0f4b1611c06","Type":"ContainerStarted","Data":"d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.809718 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mmtqg" event={"ID":"2d2fb749-294e-42b2-952e-c0f4b1611c06","Type":"ContainerStarted","Data":"8f650589cab315d92dc7adc9eb5dccb9bceeb3b5d5fda3848ea48662cc0610de"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815131 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815192 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815204 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815214 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815226 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.815237 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.818552 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921" exitCode=0 Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.818617 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.823187 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.823247 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc 
kubenswrapper[4874]: I0126 00:07:33.823303 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.823337 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.823363 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.832408 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.854398 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" 
Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.872553 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.907061 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.928348 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.928420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.928445 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.928479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.928498 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:33Z","lastTransitionTime":"2026-01-26T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.930009 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.958906 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:33 crc kubenswrapper[4874]: I0126 00:07:33.981645 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.006418 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z 
is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.023322 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.032482 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.032538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.032557 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.032588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.032607 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.042449 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.058025 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.074185 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.092466 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc 
kubenswrapper[4874]: I0126 00:07:34.108769 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.124710 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.135554 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.135590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.135601 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.135618 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.135630 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.138204 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.154774 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.173126 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.192227 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.209496 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.223710 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.238316 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.238645 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.238809 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.238957 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.239196 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.240711 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.256204 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.279393 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.294025 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.309736 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.321232 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.340454 4874 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.341638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.341809 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.341942 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.342160 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.342301 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.353687 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.368172 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.445761 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.446176 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.446385 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.446537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.446696 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.550074 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.550404 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.550537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.550679 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.550790 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.562562 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 05:26:08.204790129 +0000 UTC Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.620119 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.620145 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:34 crc kubenswrapper[4874]: E0126 00:07:34.620651 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.620151 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:34 crc kubenswrapper[4874]: E0126 00:07:34.620785 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:34 crc kubenswrapper[4874]: E0126 00:07:34.621232 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.653694 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.653756 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.653773 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.653802 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.653820 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.756706 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.757126 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.757359 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.757500 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.757629 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.825781 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181" exitCode=0 Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.825866 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.855143 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.861778 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.861840 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.861856 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.861879 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.861894 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.880210 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.901871 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.919017 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.940398 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/c
ni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.959139 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.965796 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.965866 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.965890 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.965921 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.965944 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:34Z","lastTransitionTime":"2026-01-26T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:34 crc kubenswrapper[4874]: I0126 00:07:34.974388 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.006294 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce
24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.026667 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.043317 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.060328 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.068667 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.068719 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.068738 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.068886 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.069031 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.080433 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773
a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.094356 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.112416 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.125886 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.171027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.171076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.171086 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.171102 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.171114 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.274538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.274611 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.274635 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.274659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.274675 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.378644 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.378720 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.378738 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.378767 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.378787 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.424526 4874 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.481987 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.482062 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.482082 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.482111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.482134 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.563202 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 11:20:42.096685157 +0000 UTC Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.585481 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.585553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.585573 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.585601 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.585619 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.652647 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.676909 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.688260 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.688349 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.688368 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.688396 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.688414 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.699990 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.718306 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.734413 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.757845 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.777014 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.790870 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.790921 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.790935 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.790979 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.790998 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.794278 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.823481 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.834023 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.837260 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105" exitCode=0 Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.837321 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.849928 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.867465 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.883906 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.894731 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.894763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.894775 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.894794 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.894810 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.903749 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.923918 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.944369 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.961300 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.980817 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.997916 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.998289 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.998337 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.998349 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:35 crc kubenswrapper[4874]: I0126 00:07:35.998371 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:35 crc 
kubenswrapper[4874]: I0126 00:07:35.998385 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:35Z","lastTransitionTime":"2026-01-26T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.011152 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.027343 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.039640 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.050531 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.069692 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.082067 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.097848 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.103998 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.104653 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.104734 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.104767 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.104787 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.112655 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.131288 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z 
is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.141259 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.152078 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.164429 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.207363 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.207420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.207432 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.207450 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.207462 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.311032 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.311085 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.311103 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.311128 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.311146 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.413820 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.413858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.413869 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.413885 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.413897 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.517288 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.517364 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.517390 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.517421 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.517444 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.564152 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:25:37.596201589 +0000 UTC Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.619540 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.619597 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.620138 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.619598 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.620438 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.620200 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.620723 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.620772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.620789 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.620816 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.620833 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.725068 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.725425 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.725444 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.725473 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.725495 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.829045 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.829112 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.829129 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.829156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.829174 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.845476 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886" exitCode=0 Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.845534 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.886276 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45
e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.895410 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.895458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.895476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.895502 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.895520 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.910936 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.915601 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a
8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.920708 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.920777 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.920796 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.920825 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.920845 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.932193 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.943907 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.951809 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.952539 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.952607 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.952632 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.952663 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.952685 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.965363 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.977767 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.981910 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.981952 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.981991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.982016 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.982035 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:36Z","lastTransitionTime":"2026-01-26T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:36 crc kubenswrapper[4874]: I0126 00:07:36.985155 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:36 crc kubenswrapper[4874]: E0126 00:07:36.998768 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.000618 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.006313 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.006377 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.006414 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.006446 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.006468 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.018370 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: E0126 00:07:37.023361 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: E0126 00:07:37.023506 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.025235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.025272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.025285 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.025305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.025318 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.043154 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z 
is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.056172 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.070044 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.085225 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.097991 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.115266 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.127828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.127876 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.127888 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.127904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.127916 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.128509 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.232173 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.232220 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.232230 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.232249 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.232261 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.336567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.336623 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.336637 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.336656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.336671 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.440722 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.440793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.440810 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.440836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.440858 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.544169 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.544261 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.544285 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.544314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.544335 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.564763 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:59:37.168708968 +0000 UTC Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.647468 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.647533 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.647562 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.647599 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.647623 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.750810 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.750875 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.750893 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.750919 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.750938 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.854230 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.854305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.854327 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.854353 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.854374 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.857282 4874 generic.go:334] "Generic (PLEG): container finished" podID="f3d49871-449f-49f4-a40a-121286b1e929" containerID="774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4" exitCode=0 Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.857351 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerDied","Data":"774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.878010 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.897001 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.912707 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.946413 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.956687 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.956727 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.956739 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.956756 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.956768 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:37Z","lastTransitionTime":"2026-01-26T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.968787 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:37 crc kubenswrapper[4874]: I0126 00:07:37.990098 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.007075 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.033265 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z 
is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.049051 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.059484 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.059523 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.059535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.059555 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.059569 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.073991 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.090872 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.106869 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.122170 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25a
f9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.137364 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.148824 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.162057 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.162096 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.162108 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.162127 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.162137 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.266136 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.266179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.266188 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.266479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.266501 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.370000 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.370047 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.370057 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.370071 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.370083 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.473781 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.474200 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.474224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.474277 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.474305 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.565815 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 15:10:58.054576458 +0000 UTC Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.577921 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.578242 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.578331 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.578419 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.578498 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.620310 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.620388 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:38 crc kubenswrapper[4874]: E0126 00:07:38.620464 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.620334 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:38 crc kubenswrapper[4874]: E0126 00:07:38.620550 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:38 crc kubenswrapper[4874]: E0126 00:07:38.620645 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.681022 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.681065 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.681075 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.681090 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.681104 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.784334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.784378 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.784387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.784403 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.784415 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.867448 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.868029 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.868210 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.873861 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" event={"ID":"f3d49871-449f-49f4-a40a-121286b1e929","Type":"ContainerStarted","Data":"a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.884636 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.888102 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.888152 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.888168 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.888192 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.888208 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.896694 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.906742 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.907112 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP
\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.939099 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb236
5e55c423d3b7a674cd29b1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.949823 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.969170 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.986812 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.991329 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.991382 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.991396 4874 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.991418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:38 crc kubenswrapper[4874]: I0126 00:07:38.991433 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:38Z","lastTransitionTime":"2026-01-26T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.001904 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.021450 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.042048 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.055923 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.073222 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.086770 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.094469 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.094527 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.094543 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.094567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.094585 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.104651 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.119305 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.139880 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.155223 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.174498 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.182568 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.182751 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:07:55.182718343 +0000 UTC m=+49.916824910 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.195460 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-o
penvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 
2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.196909 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.197018 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.197060 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.197081 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.197095 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.208479 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.221063 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e635
5e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.236613 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.253714 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.271136 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.283640 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") 
" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.283678 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.283702 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.283732 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.283847 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.283863 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.283873 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.283897 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284008 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.283919 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:55.283903791 +0000 UTC m=+50.018010328 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284049 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284068 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284090 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:55.284058295 +0000 UTC m=+50.018164862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284105 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284135 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:55.284112086 +0000 UTC m=+50.018218633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:39 crc kubenswrapper[4874]: E0126 00:07:39.284236 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:55.284215979 +0000 UTC m=+50.018322546 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.285173 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299205 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299636 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299679 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299698 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299727 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.299745 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.320506 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.339928 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.361847 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.379804 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.403223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.403304 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.403325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.403354 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.403377 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.413562 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:39Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.506567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.506625 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.506642 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.506671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.506688 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.566755 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 02:26:42.761833599 +0000 UTC Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.610286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.610348 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.610367 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.610392 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.610412 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.714153 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.714229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.714267 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.714286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.714298 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.817479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.817530 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.817547 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.817570 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.817587 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.877201 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.919510 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.919548 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.919559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.919574 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:39 crc kubenswrapper[4874]: I0126 00:07:39.919587 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:39Z","lastTransitionTime":"2026-01-26T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.022381 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.022441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.022452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.022488 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.022501 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.125272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.125320 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.125331 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.125355 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.125366 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.231350 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.231399 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.231411 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.231428 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.231440 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.333189 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.333226 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.333235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.333248 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.333256 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.435812 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.435840 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.435848 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.435861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.435870 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.537771 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.537800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.537808 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.537822 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.537831 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.567729 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:51:23.760487081 +0000 UTC Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.620357 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:40 crc kubenswrapper[4874]: E0126 00:07:40.620470 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.620827 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:40 crc kubenswrapper[4874]: E0126 00:07:40.620893 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.620955 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:40 crc kubenswrapper[4874]: E0126 00:07:40.621052 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.639732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.639772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.639782 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.639800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.639811 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.743449 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.743521 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.743563 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.743582 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.743593 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.846146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.846220 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.846246 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.846278 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.846303 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.881042 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.949290 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.949333 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.949343 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.949360 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:40 crc kubenswrapper[4874]: I0126 00:07:40.949374 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:40Z","lastTransitionTime":"2026-01-26T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.052399 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.052439 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.052448 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.052462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.052472 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.155834 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.155955 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.156026 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.156054 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.156074 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.259519 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.259582 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.259599 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.259626 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.259646 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.363578 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.363667 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.363699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.363728 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.363747 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.467486 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.467546 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.467566 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.467601 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.467627 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.567917 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:13:40.971874726 +0000 UTC Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.570441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.570491 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.570509 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.570533 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.570551 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.674534 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.674643 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.674666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.674696 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.674718 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.777482 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.777520 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.777532 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.777551 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.777563 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.880448 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.880550 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.880575 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.880604 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.880622 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.887260 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/0.log" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.891325 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df" exitCode=1 Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.891363 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.891902 4874 scope.go:117] "RemoveContainer" containerID="95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.923474 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"na
me\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:41Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.946770 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:41Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.967781 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:41Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.983677 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.983732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.983748 4874 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.983772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.983789 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:41Z","lastTransitionTime":"2026-01-26T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:41 crc kubenswrapper[4874]: I0126 00:07:41.991838 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:41Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.019419 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.035182 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.072051 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a
4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.086904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.087243 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.087343 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.087424 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.087496 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.094473 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.108051 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.114355 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.130921 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.171884 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.187989 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.189907 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.189992 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.190019 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.190047 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.190065 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.212045 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.241777 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.276779 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.293769 4874 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.293816 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.293828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.293865 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.293877 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.307541 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.324816 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.341657 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.366582 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.383260 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.396712 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.396764 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.396780 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.396800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.396813 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.398514 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.412123 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.436668 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb236
5e55c423d3b7a674cd29b1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.448814 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.463593 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.476531 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.488431 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.499373 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.499409 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.499418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.499433 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.499443 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.504554 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.519435 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd1
3ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.533081 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.568586 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:05:30.068304758 +0000 UTC Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.601602 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.601640 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.601651 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.601666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.601680 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.619442 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.619442 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.619569 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:42 crc kubenswrapper[4874]: E0126 00:07:42.619720 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:42 crc kubenswrapper[4874]: E0126 00:07:42.619837 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:42 crc kubenswrapper[4874]: E0126 00:07:42.619923 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.704502 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.704533 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.704541 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.704555 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.704566 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.807661 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.807702 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.807714 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.807734 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.807747 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.897084 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/0.log" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.900189 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.900387 4874 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.913634 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.913703 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.915553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.915597 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.915610 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:42Z","lastTransitionTime":"2026-01-26T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.921057 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.935043 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.946555 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.965530 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:42 crc kubenswrapper[4874]: I0126 00:07:42.986653 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.010062 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.012331 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr"] Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.013326 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.014719 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.017545 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.018592 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.018642 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.018652 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.018667 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.018677 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.030763 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.052343 4874 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.065865 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.079660 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.092949 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.103810 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.119593 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.121588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.121615 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.121625 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.121640 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.121650 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.122488 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.122516 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.122542 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.122686 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svcbn\" (UniqueName: \"kubernetes.io/projected/2a83088b-51a6-4ed6-b4e9-5191f0017930-kube-api-access-svcbn\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.135120 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.146630 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.166750 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.178697 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.194341 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.210738 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.223573 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.223678 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.223742 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svcbn\" (UniqueName: \"kubernetes.io/projected/2a83088b-51a6-4ed6-b4e9-5191f0017930-kube-api-access-svcbn\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.224082 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.224917 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225282 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a83088b-51a6-4ed6-b4e9-5191f0017930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225691 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225790 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.225832 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.227127 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.231474 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a83088b-51a6-4ed6-b4e9-5191f0017930-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.243424 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.255059 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svcbn\" (UniqueName: 
\"kubernetes.io/projected/2a83088b-51a6-4ed6-b4e9-5191f0017930-kube-api-access-svcbn\") pod \"ovnkube-control-plane-749d76644c-bdngr\" (UID: \"2a83088b-51a6-4ed6-b4e9-5191f0017930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.259056 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.276512 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.290820 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.308634 4874 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.320502 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.324951 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.328426 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.328447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.328456 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.328470 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.328479 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.339945 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.353565 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.366883 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.386307 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.407226 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:43Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.430861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.430919 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.430932 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.430950 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.430979 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.534442 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.534484 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.534494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.534511 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.534523 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.569444 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 16:13:59.14194926 +0000 UTC Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.638423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.638472 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.638489 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.638514 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.638532 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.741325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.741365 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.741376 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.741390 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.741401 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.844813 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.844891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.844910 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.844938 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.844998 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.904868 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" event={"ID":"2a83088b-51a6-4ed6-b4e9-5191f0017930","Type":"ContainerStarted","Data":"692df22f093d3d6cba90258d298204942e718f15f7706f56db57bf2ee8df94f3"} Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.947988 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.948043 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.948053 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.948074 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:43 crc kubenswrapper[4874]: I0126 00:07:43.948086 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:43Z","lastTransitionTime":"2026-01-26T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.050886 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.050945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.051004 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.051038 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.051057 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.124856 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qkq5t"] Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.125364 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.125440 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.142655 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.153780 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.153837 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.153852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.153872 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.153884 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.159201 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.173761 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.190052 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.210791 4874 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.226881 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.236639 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.236695 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzrv\" (UniqueName: \"kubernetes.io/projected/489242ac-41c4-45f2-9828-58c4012e9f64-kube-api-access-7vzrv\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.239236 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.256226 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.256918 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.256976 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.256988 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.257011 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.257023 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.276769 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.314794 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.331565 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.337396 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.337443 4874 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7vzrv\" (UniqueName: \"kubernetes.io/projected/489242ac-41c4-45f2-9828-58c4012e9f64-kube-api-access-7vzrv\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.337544 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.337598 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:44.83758258 +0000 UTC m=+39.571689117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.351760 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mou
ntPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.359284 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.359325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.359334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.359385 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.359398 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.364461 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzrv\" (UniqueName: \"kubernetes.io/projected/489242ac-41c4-45f2-9828-58c4012e9f64-kube-api-access-7vzrv\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.375444 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.391665 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.405061 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.416272 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.425664 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.462446 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.462501 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.462515 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.462534 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.462547 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.565880 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.565999 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.566020 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.566050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.566073 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.570055 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:28:48.205415263 +0000 UTC Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.620532 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.620694 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.620929 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.620997 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.621099 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.621224 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.670334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.670418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.670440 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.670472 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.670493 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.774232 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.774294 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.774310 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.774334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.774356 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.842730 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.843041 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.843177 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:45.84314386 +0000 UTC m=+40.577250427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.878781 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.878858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.878876 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.878905 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.878924 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.911795 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/1.log" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.912892 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/0.log" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.917188 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f" exitCode=1 Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.917264 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.917399 4874 scope.go:117] "RemoveContainer" containerID="95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.918617 4874 scope.go:117] "RemoveContainer" containerID="e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f" Jan 26 00:07:44 crc kubenswrapper[4874]: E0126 00:07:44.918871 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.921196 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" 
event={"ID":"2a83088b-51a6-4ed6-b4e9-5191f0017930","Type":"ContainerStarted","Data":"508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.921258 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" event={"ID":"2a83088b-51a6-4ed6-b4e9-5191f0017930","Type":"ContainerStarted","Data":"88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.939432 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.964462 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.982454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.982524 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.982548 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.982581 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.982608 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:44Z","lastTransitionTime":"2026-01-26T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:44 crc kubenswrapper[4874]: I0126 00:07:44.985527 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:44Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.002992 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.026465 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046
b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch 
factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.044663 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.064602 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.085298 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.085344 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.085361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.085384 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.085401 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.086139 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.108604 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.132049 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.147517 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.166758 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.189496 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.189567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.189590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.189622 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.189644 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.200151 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2
a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.222779 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.245101 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.266879 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.282327 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.293439 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.293498 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.293525 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.293562 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.293586 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.304309 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.323323 4874 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.348545 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.362605 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.385562 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.397191 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.397264 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.397284 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.397311 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.397329 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.408471 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.428912 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.447275 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.464412 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.495360 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.500718 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.500774 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.500793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.500817 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.500837 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.519032 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.543064 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.565025 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.570284 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 00:00:51.156895429 +0000 UTC Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.600226 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046
b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch 
factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.604840 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.604891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.604910 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.604936 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.604956 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.620827 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:45 crc kubenswrapper[4874]: E0126 00:07:45.621032 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.622362 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.641228 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 
00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.665945 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.690193 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.706862 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.706982 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.707004 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.707061 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.707076 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.708681 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.728254 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.739689 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.750835 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.763994 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.776035 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.788394 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.810347 4874 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.810408 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.810423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.810445 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.810459 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.814922 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046
b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cbcfbbde9d60c8c5d25e60b08eb4778fbbb2365e55c423d3b7a674cd29b1df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:40Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:40.645818 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:40.645851 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:40.645867 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:40.646068 6173 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0126 00:07:40.646674 6173 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:40.647079 6173 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:40.647094 6173 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:40.647106 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:07:40.647138 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0126 00:07:40.647153 6173 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:40.647159 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:07:40.647170 6173 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:40.647185 6173 factory.go:656] Stopping watch factory\\\\nI0126 00:07:40.647199 6173 ovnkube.go:599] Stopped ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch 
factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.827624 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.837201 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 
00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.848127 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.853473 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:45 crc kubenswrapper[4874]: E0126 00:07:45.853707 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:45 crc kubenswrapper[4874]: E0126 00:07:45.853832 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:47.853803708 +0000 UTC m=+42.587910285 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.864326 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.877233 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.892925 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.902117 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.913908 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.914005 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.914032 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.914064 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.914085 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:45Z","lastTransitionTime":"2026-01-26T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.915871 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:45 crc kubenswrapper[4874]: I0126 00:07:45.927493 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/1.log" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.017009 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.017084 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.017111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.017143 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.017167 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.120609 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.120670 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.120688 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.120717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.120736 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.224197 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.224264 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.224286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.224316 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.224339 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.327777 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.327845 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.327862 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.327888 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.327907 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.432341 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.432396 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.432423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.432453 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.432474 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.536651 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.536715 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.536738 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.536770 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.536793 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.570464 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:53:19.038626322 +0000 UTC Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.620289 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.620308 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:46 crc kubenswrapper[4874]: E0126 00:07:46.620577 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.620309 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:46 crc kubenswrapper[4874]: E0126 00:07:46.620714 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:46 crc kubenswrapper[4874]: E0126 00:07:46.620835 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.641004 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.641072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.641090 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.641119 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.641139 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.744336 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.744446 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.744465 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.744491 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.744510 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.847749 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.847812 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.847829 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.847854 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.847874 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.951596 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.951649 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.951671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.951699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:46 crc kubenswrapper[4874]: I0126 00:07:46.951719 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:46Z","lastTransitionTime":"2026-01-26T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.056136 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.056204 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.056217 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.056237 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.056249 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.079857 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.081340 4874 scope.go:117] "RemoveContainer" containerID="e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.081686 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.101852 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.128392 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.150249 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.159828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.159890 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.159902 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.159924 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.159939 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.170530 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.176213 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.176285 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.176303 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.176328 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.176346 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.187540 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.191891 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a
8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.197353 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.197421 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.197439 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.197466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.197488 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.202587 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.216740 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.221932 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.222132 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.222165 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.222183 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 
00:07:47.222209 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.222229 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.241191 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.243070 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 
2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.248423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.248527 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.248548 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.248574 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.248594 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.261156 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.272065 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 
2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.276652 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.276797 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.276889 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.277000 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.277084 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.292561 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 
2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.292821 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.294808 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.294948 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.295064 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.295259 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.295363 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.295416 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046
b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.310477 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.327422 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.346099 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c07
8a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.366195 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.384862 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.399065 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.399121 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.399139 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.399165 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.399184 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.411361 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.432207 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:47Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.501949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.502025 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.502042 4874 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.502067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.502085 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.571068 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:40:00.165129747 +0000 UTC Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.605744 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.605830 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.605854 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.605886 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.605912 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.620261 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.620580 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.709635 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.709698 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.709717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.709742 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.709764 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.813127 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.813177 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.813190 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.813211 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.813223 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.879222 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.879498 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:47 crc kubenswrapper[4874]: E0126 00:07:47.879657 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:51.879621336 +0000 UTC m=+46.613727913 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.916688 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.916779 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.916798 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.916827 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:47 crc kubenswrapper[4874]: I0126 00:07:47.916846 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:47Z","lastTransitionTime":"2026-01-26T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.020189 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.020257 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.020283 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.020316 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.020338 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.124012 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.124056 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.124067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.124087 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.124101 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.227427 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.227490 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.227506 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.227530 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.227548 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.330517 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.330595 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.330613 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.330638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.330660 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.434488 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.434578 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.434604 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.434644 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.434670 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.537836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.537896 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.537913 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.537937 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.537955 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.571514 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:26:19.83845768 +0000 UTC Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.620113 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.620190 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.620118 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:48 crc kubenswrapper[4874]: E0126 00:07:48.620344 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:48 crc kubenswrapper[4874]: E0126 00:07:48.620555 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:48 crc kubenswrapper[4874]: E0126 00:07:48.620760 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.641179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.641256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.641279 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.641310 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.641332 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.744857 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.744932 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.744995 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.745030 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.745088 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.847825 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.847908 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.847934 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.848017 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.848044 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.950392 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.950470 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.950492 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.950519 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:48 crc kubenswrapper[4874]: I0126 00:07:48.950543 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:48Z","lastTransitionTime":"2026-01-26T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.053498 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.053554 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.053571 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.053595 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.053612 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.156330 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.156398 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.156416 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.156441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.156460 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.260357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.260431 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.260454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.260484 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.260506 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.363531 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.363592 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.363610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.363633 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.363650 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.467357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.467437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.467459 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.467495 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.467517 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.571763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.571837 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.571760 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 21:53:14.997750716 +0000 UTC Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.571862 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.571935 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.572005 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.619858 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:49 crc kubenswrapper[4874]: E0126 00:07:49.620118 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.675433 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.675492 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.675511 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.675535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.675558 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.779052 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.779125 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.779144 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.779170 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.779190 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.882076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.882110 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.882118 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.882132 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.882142 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.985168 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.985213 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.985224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.985240 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:49 crc kubenswrapper[4874]: I0126 00:07:49.985253 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:49Z","lastTransitionTime":"2026-01-26T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.088475 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.088527 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.088539 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.088559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.088572 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.191603 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.191674 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.191699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.191725 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.191742 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.294949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.295065 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.295089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.295122 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.295146 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.398251 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.398298 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.398313 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.398334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.398346 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.501146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.501215 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.501235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.501267 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.501291 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.572637 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 12:26:46.047030082 +0000 UTC Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.604434 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.604501 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.604517 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.604542 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.604562 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.620220 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.620289 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.620409 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:50 crc kubenswrapper[4874]: E0126 00:07:50.620632 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:50 crc kubenswrapper[4874]: E0126 00:07:50.620772 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:50 crc kubenswrapper[4874]: E0126 00:07:50.621015 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.708069 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.708146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.708175 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.708215 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.708237 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.811904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.812339 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.812533 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.812734 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.812929 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.916467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.916520 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.916537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.916561 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:50 crc kubenswrapper[4874]: I0126 00:07:50.916580 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:50Z","lastTransitionTime":"2026-01-26T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.020202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.020272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.020291 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.020319 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.020337 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.123257 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.123331 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.123342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.123365 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.123378 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.226761 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.226832 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.226851 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.226882 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.226904 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.330495 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.330570 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.330588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.330615 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.330640 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.434587 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.434658 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.434678 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.434705 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.434724 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.538537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.538610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.538627 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.538654 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.538673 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.573084 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:40:20.933808612 +0000 UTC Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.619808 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:51 crc kubenswrapper[4874]: E0126 00:07:51.620082 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.642153 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.642255 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.642287 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.642324 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.642350 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.745855 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.745926 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.745954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.746025 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.746049 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.852854 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.852927 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.852951 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.853017 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.853039 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.924671 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:51 crc kubenswrapper[4874]: E0126 00:07:51.925371 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:51 crc kubenswrapper[4874]: E0126 00:07:51.925660 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:07:59.925622167 +0000 UTC m=+54.659728744 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.955465 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.955525 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.955542 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.955568 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:51 crc kubenswrapper[4874]: I0126 00:07:51.955585 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:51Z","lastTransitionTime":"2026-01-26T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.058509 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.058589 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.058612 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.058640 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.058659 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.162356 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.162420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.162432 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.162452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.162465 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.266070 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.266148 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.266179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.266210 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.266234 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.369824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.369908 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.369926 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.369955 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.370020 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.473255 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.473314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.473333 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.473356 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.473375 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.573589 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 02:16:02.047435285 +0000 UTC Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.577484 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.577572 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.577597 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.577634 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.577659 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.620162 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.620226 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.620172 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:52 crc kubenswrapper[4874]: E0126 00:07:52.620367 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:52 crc kubenswrapper[4874]: E0126 00:07:52.620455 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:52 crc kubenswrapper[4874]: E0126 00:07:52.620567 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.682171 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.682255 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.682273 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.682299 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.682316 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.785514 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.785601 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.785634 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.785666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.785694 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.888827 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.888899 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.888924 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.888992 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.889031 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.992643 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.992728 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.992752 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.992785 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:52 crc kubenswrapper[4874]: I0126 00:07:52.992810 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:52Z","lastTransitionTime":"2026-01-26T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.096514 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.096570 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.096588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.096611 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.096630 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.200394 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.200462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.200479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.200553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.200573 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.303942 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.304044 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.304067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.304097 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.304115 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.408462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.408536 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.408553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.408583 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.408601 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.524882 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.524933 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.524944 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.524991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.525008 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.574539 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 21:21:07.568360593 +0000 UTC Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.619824 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:53 crc kubenswrapper[4874]: E0126 00:07:53.620100 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.631377 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.631430 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.631447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.631476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.631494 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.735241 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.735303 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.735325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.735355 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.735378 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.838133 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.838204 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.838227 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.838256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.838279 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.941263 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.941326 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.941351 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.941383 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:53 crc kubenswrapper[4874]: I0126 00:07:53.941405 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:53Z","lastTransitionTime":"2026-01-26T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.045515 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.045582 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.045607 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.045640 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.045666 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.149503 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.149588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.149610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.149644 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.149667 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.252935 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.253047 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.253073 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.253104 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.253127 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.356450 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.356535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.356555 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.356579 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.356598 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.459267 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.459343 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.459372 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.459409 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.459436 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.563111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.563507 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.563545 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.563572 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.563589 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.575072 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 14:42:28.200866084 +0000 UTC Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.619816 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.619864 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:54 crc kubenswrapper[4874]: E0126 00:07:54.620079 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.620207 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:54 crc kubenswrapper[4874]: E0126 00:07:54.620289 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:54 crc kubenswrapper[4874]: E0126 00:07:54.620447 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.667074 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.667156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.667284 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.667315 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.667339 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.771296 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.771382 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.771413 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.771447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.771480 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.874991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.875059 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.875080 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.875105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.875125 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.977989 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.978048 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.978068 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.978093 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:54 crc kubenswrapper[4874]: I0126 00:07:54.978111 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:54Z","lastTransitionTime":"2026-01-26T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.081577 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.081642 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.081659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.081684 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.081701 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.185465 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.185543 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.185561 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.185590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.185609 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.265244 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.265641 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:08:27.265597665 +0000 UTC m=+81.999704232 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.288723 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.288780 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.288796 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.288823 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.288839 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.366816 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.366904 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.367006 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.367045 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367241 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367269 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367289 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367377 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:27.367352618 +0000 UTC m=+82.101459185 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367386 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367532 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:27.367494032 +0000 UTC m=+82.101600599 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367409 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367594 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367668 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367692 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367765 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:27.367741319 +0000 UTC m=+82.101847946 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.367812 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:27.36779496 +0000 UTC m=+82.101901657 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.392183 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.392263 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.392288 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.392319 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.392345 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.495070 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.495135 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.495151 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.495181 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.495205 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.575629 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:21:24.420788522 +0000 UTC Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.598865 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.598945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.598998 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.599030 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.599054 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.620241 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:55 crc kubenswrapper[4874]: E0126 00:07:55.620422 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.641933 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.674271 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.693113 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.702717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.702773 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.702791 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.702815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.702835 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.714613 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.740269 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.757055 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.771867 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.786011 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.807325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.807394 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.807405 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.807423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.807435 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.810893 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.828665 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.845175 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.864552 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.894327 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046
b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.908367 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.910727 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.910768 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.910779 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.910799 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.910815 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:55Z","lastTransitionTime":"2026-01-26T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.926653 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 
00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.942163 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:55 crc kubenswrapper[4874]: I0126 00:07:55.957602 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.014178 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.014279 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.014290 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.014305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.014315 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.117022 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.117095 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.117118 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.117150 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.117174 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.221055 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.221135 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.221159 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.221192 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.221218 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.324351 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.324454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.324478 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.324509 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.324534 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.427870 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.427933 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.427950 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.428003 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.428021 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.531420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.531466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.531476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.531494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.531507 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.576407 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 17:53:09.736628672 +0000 UTC Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.620644 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.620644 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.620667 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:56 crc kubenswrapper[4874]: E0126 00:07:56.620861 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:56 crc kubenswrapper[4874]: E0126 00:07:56.621134 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:56 crc kubenswrapper[4874]: E0126 00:07:56.621234 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.634708 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.634765 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.634782 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.634809 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.634827 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.738437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.738508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.738526 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.738556 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.738596 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.841952 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.842079 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.842111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.842144 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.842167 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.946131 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.946197 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.946215 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.946247 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:56 crc kubenswrapper[4874]: I0126 00:07:56.946267 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:56Z","lastTransitionTime":"2026-01-26T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.050562 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.050638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.050662 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.050695 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.050721 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.153385 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.153476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.153504 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.153539 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.153564 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.257303 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.257375 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.257393 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.257418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.257436 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.360486 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.360568 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.360590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.360614 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.360631 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.459645 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.459717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.459739 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.459772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.459795 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.482043 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:57Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.487617 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.487675 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.487692 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.487717 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.487736 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.507342 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:57Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.512495 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.512551 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.512574 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.512603 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.512624 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.536820 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:57Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.542292 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.542345 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.542362 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.542387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.542405 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.562152 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:57Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.567556 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.567610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.567627 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.567649 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.567665 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.576796 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 20:36:33.401746464 +0000 UTC Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.586807 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:07:57Z is after 2025-08-24T17:21:41Z" Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.587068 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.589629 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.589698 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.589720 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.589750 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.589772 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.620358 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:57 crc kubenswrapper[4874]: E0126 00:07:57.620541 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.693461 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.693543 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.693568 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.693602 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.693628 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.796445 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.796587 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.796605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.796628 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.796648 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.900001 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.900067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.900086 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.900111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:57 crc kubenswrapper[4874]: I0126 00:07:57.900127 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:57Z","lastTransitionTime":"2026-01-26T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.003298 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.003364 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.003386 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.003420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.003442 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.106346 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.106414 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.106441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.106471 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.106494 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.209662 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.209728 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.209750 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.209783 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.209805 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.313027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.313097 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.313126 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.313155 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.313175 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.416496 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.416543 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.416559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.416609 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.416626 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.519819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.519892 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.519933 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.520001 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.520024 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.589755 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 03:43:20.201735084 +0000 UTC Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.620091 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.620112 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.620208 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:07:58 crc kubenswrapper[4874]: E0126 00:07:58.620269 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:07:58 crc kubenswrapper[4874]: E0126 00:07:58.620349 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:07:58 crc kubenswrapper[4874]: E0126 00:07:58.620452 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.623212 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.623273 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.623297 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.623325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.623342 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.725897 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.725999 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.726023 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.726053 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.726076 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.829578 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.829624 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.829642 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.829666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.829682 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.932626 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.933041 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.933224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.933416 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:58 crc kubenswrapper[4874]: I0126 00:07:58.933843 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:58Z","lastTransitionTime":"2026-01-26T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.037825 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.037895 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.037913 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.037939 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.037990 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.142387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.142438 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.142454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.142478 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.142500 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.245357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.245424 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.245441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.245464 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.245481 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.348578 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.348620 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.348636 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.348658 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.348676 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.451327 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.451688 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.452046 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.452228 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.452397 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.555347 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.555888 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.556129 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.556296 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.556443 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.592057 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 19:40:31.131036478 +0000 UTC Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.620352 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:07:59 crc kubenswrapper[4874]: E0126 00:07:59.620694 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.660463 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.660503 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.660516 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.660535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.660547 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.763693 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.763747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.763766 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.763793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.763813 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.867613 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.867685 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.867709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.867736 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.867757 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.971765 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.971837 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.971863 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.971890 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:07:59 crc kubenswrapper[4874]: I0126 00:07:59.971909 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:07:59Z","lastTransitionTime":"2026-01-26T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.023106 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:00 crc kubenswrapper[4874]: E0126 00:08:00.023647 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:00 crc kubenswrapper[4874]: E0126 00:08:00.023888 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:16.023857191 +0000 UTC m=+70.757963758 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.074895 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.074991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.075016 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.075093 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.075117 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.177695 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.177743 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.177752 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.177767 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.177778 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.280250 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.280312 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.280330 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.280355 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.280373 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.382776 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.382878 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.382903 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.382931 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.382948 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.485818 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.486106 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.486170 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.486234 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.486302 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.588334 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.588397 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.588414 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.588442 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.588460 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.592949 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:32:46.120805105 +0000 UTC Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.619674 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.619744 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:00 crc kubenswrapper[4874]: E0126 00:08:00.619952 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.620063 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:00 crc kubenswrapper[4874]: E0126 00:08:00.620262 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:00 crc kubenswrapper[4874]: E0126 00:08:00.620425 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.691810 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.691863 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.691880 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.691903 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.691921 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.794763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.794819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.794836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.794858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.794909 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.858087 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.871459 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.881312 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.897454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.897518 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.897542 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.897572 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.897596 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:00Z","lastTransitionTime":"2026-01-26T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.907178 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.924249 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.958314 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:00 crc kubenswrapper[4874]: I0126 00:08:00.980247 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.000705 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:00Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.002721 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.002776 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.002789 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.002819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.002836 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.017285 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.037166 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"ru
n-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.048573 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.061533 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 
00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.082166 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.093541 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.103511 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.105210 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.105233 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.105266 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.105281 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.105291 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.118557 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.130298 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.146348 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.158777 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:01Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.207571 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.207608 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.207618 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.207634 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.207645 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.310800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.310875 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.310899 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.310932 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.310956 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.412813 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.412849 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.412858 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.412873 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.412882 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.516508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.516545 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.516556 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.516575 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.516588 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.594160 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:04:57.917862014 +0000 UTC Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.619569 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.619954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.620027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.620044 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.620074 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: E0126 00:08:01.620029 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.620092 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.723580 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.723638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.723650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.723675 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.723688 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.826419 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.826505 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.826529 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.826559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.826586 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.931458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.931505 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.931517 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.931535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:01 crc kubenswrapper[4874]: I0126 00:08:01.931548 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:01Z","lastTransitionTime":"2026-01-26T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.033981 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.034017 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.034026 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.034040 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.034052 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.136377 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.136411 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.136418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.136466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.136476 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.239713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.239792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.239811 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.239842 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.239866 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.343076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.343154 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.343197 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.343223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.343241 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.447406 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.447481 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.447524 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.447554 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.447578 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.550847 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.550884 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.550895 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.550910 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.550921 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.594592 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:29:29.375506422 +0000 UTC Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.619996 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.620095 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:02 crc kubenswrapper[4874]: E0126 00:08:02.620155 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.620182 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:02 crc kubenswrapper[4874]: E0126 00:08:02.620365 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:02 crc kubenswrapper[4874]: E0126 00:08:02.620514 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.621303 4874 scope.go:117] "RemoveContainer" containerID="e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.653878 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.654231 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.654244 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.654264 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.654276 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.757049 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.757113 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.757124 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.757146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.757158 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.861678 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.861721 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.861732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.861748 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.861761 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.964110 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.964149 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.964161 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.964178 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:02 crc kubenswrapper[4874]: I0126 00:08:02.964196 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:02Z","lastTransitionTime":"2026-01-26T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.003807 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/1.log" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.006278 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.007306 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.022668 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f
7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.042109 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.056593 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.066925 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.067018 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.067035 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.067058 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.067075 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.074377 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.085254 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.104621 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.123652 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.138024 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.149214 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.160393 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170247 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170289 4874 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170298 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170264 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170317 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.170506 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.184257 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.202792 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.218044 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.229833 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.242111 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.278904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.278978 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.278991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.279009 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.279024 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.280510 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 
00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.290973 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.380744 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.380778 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.380786 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.380799 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.380808 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.483753 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.483815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.483832 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.483857 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.483875 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.586873 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.586928 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.586946 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.586997 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.587018 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.595184 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:53:25.95404423 +0000 UTC Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.619898 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:03 crc kubenswrapper[4874]: E0126 00:08:03.620145 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.689293 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.689338 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.689355 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.689378 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.689394 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.792466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.792547 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.792567 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.792778 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.792796 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.896035 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.896155 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.896176 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.896203 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.896223 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.998530 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.998605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.998629 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.998659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:03 crc kubenswrapper[4874]: I0126 00:08:03.998681 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:03Z","lastTransitionTime":"2026-01-26T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.012944 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/2.log" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.014054 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/1.log" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.018683 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" exitCode=1 Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.018740 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.018796 4874 scope.go:117] "RemoveContainer" containerID="e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.019772 4874 scope.go:117] "RemoveContainer" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" Jan 26 00:08:04 crc kubenswrapper[4874]: E0126 00:08:04.020080 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.044777 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.078781 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.099407 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.102590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.102646 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.102665 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.102692 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.102710 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.115171 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.135220 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.150578 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.171227 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.188853 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208413 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208486 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208503 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208529 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208546 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.208550 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.225396 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.255923 4874 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3cfcf927532110fdc1ebc6607861afa57312046b7cec911ad348fdc554ef56f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"3.099403 6317 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099465 6317 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:07:43.099861 6317 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:07:43.099874 6317 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0126 00:07:43.099907 6317 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:07:43.099937 6317 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0126 00:07:43.099948 6317 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0126 00:07:43.099952 6317 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0126 00:07:43.099977 6317 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0126 00:07:43.099983 6317 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0126 00:07:43.099990 6317 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0126 00:07:43.099997 6317 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0126 00:07:43.100010 6317 factory.go:656] Stopping watch factory\\\\nI0126 00:07:43.100011 6317 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0126 00:07:43.100021 6317 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.271715 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.289942 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 
00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.310897 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.311654 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.311735 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.311753 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.311784 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.311804 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.329844 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.347307 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.371923 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.385267 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:04Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.415093 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.415156 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.415174 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.415253 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.415284 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.518573 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.518661 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.518681 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.518706 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.518723 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.595719 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 05:23:41.917134286 +0000 UTC Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.619747 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.619779 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.619830 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:04 crc kubenswrapper[4874]: E0126 00:08:04.620021 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:04 crc kubenswrapper[4874]: E0126 00:08:04.620136 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:04 crc kubenswrapper[4874]: E0126 00:08:04.620318 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.621498 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.621558 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.621576 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.621600 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.621620 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.725152 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.725211 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.725229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.725254 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.725270 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.828320 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.828374 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.828391 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.828415 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.828432 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.932502 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.932665 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.932702 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.932734 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:04 crc kubenswrapper[4874]: I0126 00:08:04.932760 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:04Z","lastTransitionTime":"2026-01-26T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.031998 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/2.log" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.035329 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.035395 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.035414 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.035439 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.035456 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.038127 4874 scope.go:117] "RemoveContainer" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" Jan 26 00:08:05 crc kubenswrapper[4874]: E0126 00:08:05.038603 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.055866 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.069356 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 
00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.088565 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.103325 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.118039 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.129740 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.138831 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.138902 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.138925 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.138954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.139028 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.150921 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.163421 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.181201 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.195013 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.211687 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.228647 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.242455 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.242521 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.242544 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.242605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.242633 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.245721 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.277317 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.296223 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.312429 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.328773 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.342663 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.346034 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.346078 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.346093 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.346114 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.346131 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.448822 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.448887 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.448904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.448936 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.448956 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.551955 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.552121 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.552143 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.552168 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.552188 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.596263 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:00:48.632830529 +0000 UTC Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.619491 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:05 crc kubenswrapper[4874]: E0126 00:08:05.619835 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.648918 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.655005 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.655072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.655089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.655111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.655128 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.673717 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4
c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.692519 4874 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.715129 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.737065 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.758123 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.758175 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.758189 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.758211 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.758226 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.761023 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.777157 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.795842 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92e
daf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.815870 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.835545 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.859222 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.862123 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.862188 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc 
kubenswrapper[4874]: I0126 00:08:05.862207 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.862233 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.862251 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.876266 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.897550 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.930439 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.953669 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.964709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.964778 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.964803 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.964839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.964864 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:05Z","lastTransitionTime":"2026-01-26T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.976384 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:05 crc kubenswrapper[4874]: I0126 00:08:05.994819 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:05Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.009558 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:06Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.072899 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.072951 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.072994 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.073018 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.073041 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.176615 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.176668 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.176684 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.176709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.176727 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.279607 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.279677 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.279699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.279729 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.279751 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.382548 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.382608 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.382731 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.382764 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.382789 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.485935 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.486035 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.486052 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.486079 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.486096 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.589643 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.589713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.589732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.589767 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.589785 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.596841 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:09:22.640614893 +0000 UTC Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.619779 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.619862 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.619927 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:06 crc kubenswrapper[4874]: E0126 00:08:06.620021 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:06 crc kubenswrapper[4874]: E0126 00:08:06.620094 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:06 crc kubenswrapper[4874]: E0126 00:08:06.620265 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.692879 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.692941 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.692979 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.693002 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.693021 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.797024 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.797089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.797105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.797135 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.797154 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.899592 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.899649 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.899665 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.899689 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:06 crc kubenswrapper[4874]: I0126 00:08:06.899707 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:06Z","lastTransitionTime":"2026-01-26T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.002534 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.002605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.002618 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.002659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.002682 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.106509 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.106569 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.106586 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.106653 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.106671 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.210046 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.210111 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.210129 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.210157 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.210174 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.313301 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.313347 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.313369 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.313390 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.313404 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.416372 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.416438 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.416454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.416481 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.416501 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.520059 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.520139 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.520158 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.520187 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.520207 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.597917 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:18:30.773360556 +0000 UTC Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.620803 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.620943 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.627164 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.627190 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.627198 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.627208 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.627219 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.664305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.664337 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.664346 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.664361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.664370 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.679637 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:07Z is after 
2025-08-24T17:21:41Z" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.683748 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.683795 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.683813 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.683838 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.683855 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.702938 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:07Z is after 
2025-08-24T17:21:41Z" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.713026 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.713118 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.713139 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.713196 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.713218 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.732916 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:07Z is after 
2025-08-24T17:21:41Z" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.738376 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.738421 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.738432 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.738450 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.738465 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.758804 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:07Z is after 
2025-08-24T17:21:41Z" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.769219 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.769286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.769306 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.769332 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.769381 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.786139 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:07Z is after 
2025-08-24T17:21:41Z" Jan 26 00:08:07 crc kubenswrapper[4874]: E0126 00:08:07.786371 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.789262 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.789321 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.789344 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.789377 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.789400 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.892351 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.892404 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.892422 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.892449 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.892464 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.994808 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.994850 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.994861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.994877 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:07 crc kubenswrapper[4874]: I0126 00:08:07.994889 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:07Z","lastTransitionTime":"2026-01-26T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.097588 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.097650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.097671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.097699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.097722 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.201053 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.201118 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.201260 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.201313 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.201338 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.304066 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.304126 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.304144 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.304170 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.304188 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.407651 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.407727 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.407753 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.407787 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.407810 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.510692 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.510751 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.510773 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.510802 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.510825 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.598140 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:53:25.180712087 +0000 UTC Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.613757 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.613799 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.613807 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.613819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.613828 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.620072 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.620153 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.620224 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:08 crc kubenswrapper[4874]: E0126 00:08:08.620323 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:08 crc kubenswrapper[4874]: E0126 00:08:08.620424 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:08 crc kubenswrapper[4874]: E0126 00:08:08.620523 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.730473 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.730549 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.730573 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.730597 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.730614 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.833819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.833856 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.833867 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.833883 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.833892 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.937182 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.937237 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.937254 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.937281 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:08 crc kubenswrapper[4874]: I0126 00:08:08.937300 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:08Z","lastTransitionTime":"2026-01-26T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.041102 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.041147 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.041163 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.041191 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.041215 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.144071 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.144125 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.144142 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.144166 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.144189 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.247388 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.247424 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.247433 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.247449 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.247463 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.350868 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.350905 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.350915 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.350931 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.350940 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.454049 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.454093 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.454105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.454124 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.454136 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.557180 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.557216 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.557224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.557240 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.557250 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.598869 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 11:32:49.519927811 +0000 UTC Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.620252 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:09 crc kubenswrapper[4874]: E0126 00:08:09.620411 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.660384 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.660465 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.660485 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.660505 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.660534 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.763121 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.763181 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.763193 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.763211 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.763223 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.866914 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.866975 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.866986 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.867003 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.867016 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.969916 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.970006 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.970024 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.970050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:09 crc kubenswrapper[4874]: I0126 00:08:09.970069 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:09Z","lastTransitionTime":"2026-01-26T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.072234 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.072294 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.072314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.072342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.072360 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.175105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.175174 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.175185 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.175223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.175233 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.277499 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.277566 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.277583 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.277609 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.277627 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.385040 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.386327 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.386345 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.386373 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.386389 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.489426 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.489483 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.489494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.489508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.489517 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.592692 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.592754 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.592766 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.592791 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.592806 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.599299 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 06:05:59.413687564 +0000 UTC Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.619686 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.619776 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.619816 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:10 crc kubenswrapper[4874]: E0126 00:08:10.619992 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:10 crc kubenswrapper[4874]: E0126 00:08:10.620123 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:10 crc kubenswrapper[4874]: E0126 00:08:10.620245 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.696246 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.696305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.696322 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.696348 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.696366 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.799921 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.800053 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.800078 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.800115 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.800139 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.902792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.902875 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.902893 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.902924 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:10 crc kubenswrapper[4874]: I0126 00:08:10.902940 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:10Z","lastTransitionTime":"2026-01-26T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.005891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.005949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.006002 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.006029 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.006050 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.109092 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.109130 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.109139 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.109155 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.109164 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.211650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.211723 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.211747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.211778 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.211801 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.314656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.314735 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.314761 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.314794 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.314819 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.417241 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.417275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.417285 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.417302 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.417314 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.519921 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.519989 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.520002 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.520025 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.520040 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.599945 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 17:59:05.305365729 +0000 UTC Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.619874 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:11 crc kubenswrapper[4874]: E0126 00:08:11.620082 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.622380 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.622437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.622454 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.622478 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.622496 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.725477 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.725521 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.725534 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.725555 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.725570 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.828146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.828182 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.828191 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.828205 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.828215 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.931230 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.931361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.931381 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.931407 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:11 crc kubenswrapper[4874]: I0126 00:08:11.931426 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:11Z","lastTransitionTime":"2026-01-26T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.033650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.033713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.033732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.033757 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.033776 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.137229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.137302 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.137325 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.137358 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.137382 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.241155 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.241219 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.241237 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.241265 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.241283 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.344361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.344418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.344435 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.344458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.344477 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.448376 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.448469 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.448496 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.448537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.448559 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.551682 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.551755 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.551766 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.551789 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.551803 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.600313 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 17:08:48.133756231 +0000 UTC Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.619750 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.619821 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:12 crc kubenswrapper[4874]: E0126 00:08:12.620055 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.620120 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:12 crc kubenswrapper[4874]: E0126 00:08:12.620303 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:12 crc kubenswrapper[4874]: E0126 00:08:12.620553 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.660554 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.660608 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.660623 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.660646 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.660661 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.763441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.763516 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.763537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.763570 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.763589 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.867225 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.867318 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.867335 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.867363 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.867381 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.970235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.970299 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.970323 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.970353 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:12 crc kubenswrapper[4874]: I0126 00:08:12.970372 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:12Z","lastTransitionTime":"2026-01-26T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.073236 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.073291 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.073307 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.073331 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.073348 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.176192 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.176250 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.176267 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.176291 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.176307 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.279138 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.279189 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.279202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.279223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.279236 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.381793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.381879 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.381906 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.381940 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.381997 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.484893 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.484987 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.485006 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.485038 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.485060 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.587788 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.587839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.587852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.587875 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.587889 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.601198 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:18:30.884363961 +0000 UTC Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.619612 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:13 crc kubenswrapper[4874]: E0126 00:08:13.619842 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.690625 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.690680 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.690694 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.690714 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.690726 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.823713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.823746 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.823760 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.823780 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.823792 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.926302 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.926360 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.926371 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.926396 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:13 crc kubenswrapper[4874]: I0126 00:08:13.926414 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:13Z","lastTransitionTime":"2026-01-26T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.029716 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.029772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.029783 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.029804 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.029818 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.132029 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.132105 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.132126 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.132159 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.132180 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.235107 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.235181 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.235202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.235237 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.235264 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.338130 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.338195 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.338214 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.338242 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.338263 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.441458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.441542 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.441569 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.441605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.441629 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.545733 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.545883 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.545916 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.546032 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.546114 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.602258 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:14:40.343238457 +0000 UTC Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.619745 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.619871 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.619745 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:14 crc kubenswrapper[4874]: E0126 00:08:14.620094 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:14 crc kubenswrapper[4874]: E0126 00:08:14.620059 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:14 crc kubenswrapper[4874]: E0126 00:08:14.620218 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.648860 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.648934 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.648946 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.648984 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.648998 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.752915 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.753021 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.753041 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.753067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.753087 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.856362 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.856440 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.856467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.856499 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.856537 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.960401 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.960508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.960554 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.960585 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:14 crc kubenswrapper[4874]: I0126 00:08:14.960604 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:14Z","lastTransitionTime":"2026-01-26T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.064031 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.064085 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.064100 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.064123 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.064140 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.167050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.167091 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.167104 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.167120 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.167132 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.270530 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.270584 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.270596 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.270617 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.270629 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.373227 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.373267 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.373283 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.373305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.373322 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.475782 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.475852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.475869 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.475891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.475908 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.579031 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.579079 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.579089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.579108 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.579118 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.603291 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 15:45:33.260571059 +0000 UTC Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.619860 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:15 crc kubenswrapper[4874]: E0126 00:08:15.620025 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.642150 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.661827 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.678496 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.682994 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.683049 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc 
kubenswrapper[4874]: I0126 00:08:15.683072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.683099 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.683117 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.692609 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.707834 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.726552 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.747796 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.766766 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785748 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785804 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785825 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785851 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785869 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.785859 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.801757 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.830473 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975
144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.842749 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.855740 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.869223 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.888943 4874 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.889003 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.889016 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.889035 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.889059 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.904555 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01d
ecee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.920259 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.936693 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.955911 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:15Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.992753 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.992808 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.992828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.992854 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:15 crc kubenswrapper[4874]: I0126 00:08:15.992872 4874 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:15Z","lastTransitionTime":"2026-01-26T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.096405 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.096456 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.096473 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.096497 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.096515 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.110047 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:16 crc kubenswrapper[4874]: E0126 00:08:16.110234 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:16 crc kubenswrapper[4874]: E0126 00:08:16.110348 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:08:48.110318235 +0000 UTC m=+102.844424812 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.199696 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.199755 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.199774 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.199805 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.199827 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.302087 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.302140 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.302155 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.302179 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.302196 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.405432 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.405482 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.405501 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.405524 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.405541 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.508377 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.508426 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.508440 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.508460 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.508474 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.603650 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 19:05:46.852904729 +0000 UTC Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.610915 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.610985 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.611003 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.611023 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.611041 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.619370 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.619402 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:16 crc kubenswrapper[4874]: E0126 00:08:16.619469 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.619382 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:16 crc kubenswrapper[4874]: E0126 00:08:16.619567 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:16 crc kubenswrapper[4874]: E0126 00:08:16.619719 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.713900 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.713941 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.713953 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.713987 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.713999 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.817192 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.817259 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.817282 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.817307 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.817323 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.919704 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.919793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.919827 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.919861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:16 crc kubenswrapper[4874]: I0126 00:08:16.919882 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:16Z","lastTransitionTime":"2026-01-26T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.022627 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.022685 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.022700 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.022722 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.022735 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.126128 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.126360 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.126369 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.126388 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.126398 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.229354 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.229447 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.229467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.229494 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.229512 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.332080 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.332143 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.332158 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.332177 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.332187 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.435594 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.435647 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.435664 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.435689 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.435706 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.538292 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.538339 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.538356 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.538381 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.538397 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.604292 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:19:57.377063916 +0000 UTC Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.621683 4874 scope.go:117] "RemoveContainer" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" Jan 26 00:08:17 crc kubenswrapper[4874]: E0126 00:08:17.622043 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.622214 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:17 crc kubenswrapper[4874]: E0126 00:08:17.622410 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.640582 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.640673 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.640700 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.640732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.640757 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.743462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.743496 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.743504 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.743520 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.743529 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.847587 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.847670 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.847692 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.847718 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.847738 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.917940 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.918029 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.918056 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.918076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.918093 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: E0126 00:08:17.935724 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:17Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.944771 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.944809 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.944821 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.944836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.944846 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: E0126 00:08:17.961246 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:17Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.965360 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.965406 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.965423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.965443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.965460 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:17 crc kubenswrapper[4874]: E0126 00:08:17.982615 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:17Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.988087 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.988172 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.988194 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.988255 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:17 crc kubenswrapper[4874]: I0126 00:08:17.988276 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:17Z","lastTransitionTime":"2026-01-26T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.005856 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.009899 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.009950 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.009993 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.010015 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.010031 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.026336 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.026700 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.028945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.029021 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.029038 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.029059 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.029101 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.086813 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/0.log" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.086873 4874 generic.go:334] "Generic (PLEG): container finished" podID="23d0d511-4c29-434d-92b1-74ca6909e53f" containerID="78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154" exitCode=1 Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.086915 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerDied","Data":"78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.087376 4874 scope.go:117] "RemoveContainer" containerID="78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.102856 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.119801 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.131697 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.131797 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.131852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.131881 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.131898 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.134349 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.164355 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.182878 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.198362 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.215363 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.234420 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.234694 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.234844 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.235043 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.235219 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.242456 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.255026 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.269232 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.284223 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.296507 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.312323 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.329498 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.337448 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.337511 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.337523 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.337538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.337548 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.343483 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.354124 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.364314 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.377018 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.439648 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.439757 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.439838 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.439908 4874 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.439988 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.543577 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.545130 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.545191 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.545253 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.545283 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.605302 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 22:58:30.950476634 +0000 UTC Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.620068 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.620146 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.620185 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.620146 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.620355 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:18 crc kubenswrapper[4874]: E0126 00:08:18.620528 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.647763 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.647791 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.647800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.647815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.647826 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.750643 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.750664 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.750672 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.750682 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.750689 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.853879 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.853920 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.853929 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.853949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.853975 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.956383 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.956418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.956427 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.956442 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:18 crc kubenswrapper[4874]: I0126 00:08:18.956452 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:18Z","lastTransitionTime":"2026-01-26T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.059516 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.059617 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.059644 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.059676 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.059704 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.093816 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/0.log" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.093889 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerStarted","Data":"5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.116353 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.135054 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.156120 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.163090 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.163148 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.163166 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.163190 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.163209 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.177053 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.209199 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.226413 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.243843 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 
00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.263792 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.266123 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.266182 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.266199 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.266224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.266242 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.282927 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.302087 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.325820 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.342110 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.360664 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.369158 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.369226 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.369247 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.369272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.369288 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.389213 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.405294 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.419725 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.433471 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.451150 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.472291 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.472342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.472358 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.472399 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.472417 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.575689 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.575769 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.575792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.575823 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.575844 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.606011 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 20:27:53.958231206 +0000 UTC Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.620641 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:19 crc kubenswrapper[4874]: E0126 00:08:19.620868 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.678869 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.679038 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.679057 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.679083 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.679102 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.781753 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.781788 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.781799 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.781816 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.781828 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.885229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.885296 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.885319 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.885349 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.885373 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.988925 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.989012 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.989032 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.989060 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:19 crc kubenswrapper[4874]: I0126 00:08:19.989079 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:19Z","lastTransitionTime":"2026-01-26T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.091718 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.091793 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.091811 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.091836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.091854 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.194931 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.195027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.195046 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.195072 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.195091 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.297949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.298069 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.298096 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.298127 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.298148 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.401357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.401443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.401467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.401497 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.401518 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.510035 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.510089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.510112 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.510151 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.510172 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.606212 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:10:19.521390357 +0000 UTC Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.613202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.613230 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.613238 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.613251 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.613260 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.621399 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.621482 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:20 crc kubenswrapper[4874]: E0126 00:08:20.621621 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.621671 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:20 crc kubenswrapper[4874]: E0126 00:08:20.621718 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:20 crc kubenswrapper[4874]: E0126 00:08:20.621882 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.715471 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.715523 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.715540 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.715565 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.715583 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.818375 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.818419 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.818428 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.818443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.818453 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.921236 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.921307 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.921323 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.921345 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:20 crc kubenswrapper[4874]: I0126 00:08:20.921361 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:20Z","lastTransitionTime":"2026-01-26T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.025066 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.025122 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.025142 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.025170 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.025187 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.127633 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.127697 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.127715 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.127741 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.127759 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.231559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.231646 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.231671 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.232169 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.232527 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.336195 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.336556 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.336703 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.336891 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.337122 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.440458 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.440512 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.440529 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.440553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.440572 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.543752 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.543840 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.543874 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.543906 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.543929 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.606537 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 19:16:27.098272559 +0000 UTC Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.620094 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:21 crc kubenswrapper[4874]: E0126 00:08:21.620318 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.647387 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.647437 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.647455 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.647477 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.647495 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.751220 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.751289 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.751310 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.751340 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.751365 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.854828 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.854928 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.854949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.855013 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.855033 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.958599 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.958666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.958683 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.958707 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:21 crc kubenswrapper[4874]: I0126 00:08:21.958722 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:21Z","lastTransitionTime":"2026-01-26T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.062948 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.063108 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.063127 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.063152 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.063172 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.165710 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.165792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.165822 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.165852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.165871 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.269311 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.269382 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.269409 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.269627 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.269661 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.373209 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.373586 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.373653 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.373735 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.373803 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.476611 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.476672 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.476691 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.476715 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.476732 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.579911 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.579991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.580008 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.580034 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.580054 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.607188 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 17:51:02.74589842 +0000 UTC Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.619835 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.619872 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:22 crc kubenswrapper[4874]: E0126 00:08:22.620105 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.619890 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:22 crc kubenswrapper[4874]: E0126 00:08:22.620268 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:22 crc kubenswrapper[4874]: E0126 00:08:22.620523 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.636922 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.683005 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.683338 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.683537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.683721 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.683875 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.789097 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.789482 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.789693 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.789889 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.790382 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.894358 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.894734 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.894912 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.895096 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.895321 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.998720 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.998781 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.998802 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.998831 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:22 crc kubenswrapper[4874]: I0126 00:08:22.998853 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:22Z","lastTransitionTime":"2026-01-26T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.108467 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.108549 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.108570 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.108597 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.108624 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.211910 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.212026 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.212049 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.212076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.212099 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.351418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.351501 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.351520 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.351549 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.351570 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.454952 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.455030 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.455047 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.455073 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.455091 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.558571 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.558632 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.558648 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.558673 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.558691 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.607737 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 19:19:00.621697811 +0000 UTC Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.620360 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:23 crc kubenswrapper[4874]: E0126 00:08:23.620778 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.662732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.662821 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.662850 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.662885 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.662907 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.767159 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.767286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.767305 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.767332 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.767351 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.870654 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.870727 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.870746 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.870771 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.870790 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.974106 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.974171 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.974189 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.974217 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:23 crc kubenswrapper[4874]: I0126 00:08:23.974242 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:23Z","lastTransitionTime":"2026-01-26T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.077758 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.077806 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.077824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.077846 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.077862 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.181720 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.181783 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.181825 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.181850 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.181872 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.284787 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.284839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.284859 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.284882 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.284899 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.387768 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.387834 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.387857 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.387886 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.387929 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.491464 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.491551 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.491575 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.491612 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.491638 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.594998 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.595063 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.595083 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.595109 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.595130 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.608284 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 05:44:07.91154099 +0000 UTC Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.619957 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.620159 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:24 crc kubenswrapper[4874]: E0126 00:08:24.620366 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.620412 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:24 crc kubenswrapper[4874]: E0126 00:08:24.620598 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:24 crc kubenswrapper[4874]: E0126 00:08:24.620848 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.698558 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.698622 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.698639 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.698662 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.698679 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.802479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.802566 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.802591 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.802622 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.802648 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.906223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.906277 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.906293 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.906317 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:24 crc kubenswrapper[4874]: I0126 00:08:24.906334 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:24Z","lastTransitionTime":"2026-01-26T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.010451 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.010519 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.010536 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.010563 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.010591 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.114151 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.114212 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.114230 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.114253 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.114272 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.217529 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.217590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.217606 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.217680 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.217701 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.320679 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.320732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.320750 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.320774 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.320791 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.423681 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.423751 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.423772 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.423802 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.423827 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.527124 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.527218 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.527234 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.527259 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.527276 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.608627 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:15:24.876835895 +0000 UTC Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.620283 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:25 crc kubenswrapper[4874]: E0126 00:08:25.620483 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.629320 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.629375 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.629393 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.629416 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.629436 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.638182 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00
:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.658161 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.682296 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.702377 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.727434 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.731819 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.731892 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.731919 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.731949 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.732000 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.748934 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.782402 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.799904 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.820835 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.834851 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.834909 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.834985 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.835014 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.835033 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.838313 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"090de823-350a-4302-a4e8-9816de512911\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.858775 4874 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.883587 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.905819 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.922394 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.938811 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.938918 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.938937 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.938995 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.939014 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:25Z","lastTransitionTime":"2026-01-26T00:08:25Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.948071 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.967686 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.983835 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:25 crc kubenswrapper[4874]: I0126 00:08:25.997186 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:25Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.009629 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:26Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.041904 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.042313 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.042470 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.042621 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.042763 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.145715 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.145787 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.145806 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.145834 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.145854 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.249392 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.249468 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.249486 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.249514 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.249534 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.353345 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.353428 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.353448 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.353477 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.353500 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.456861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.456916 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.456933 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.457001 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.457020 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.560195 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.560255 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.560272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.560296 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.560318 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.609114 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:49:13.124274302 +0000 UTC Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.619745 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.619918 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:26 crc kubenswrapper[4874]: E0126 00:08:26.620134 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:26 crc kubenswrapper[4874]: E0126 00:08:26.620228 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.619769 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:26 crc kubenswrapper[4874]: E0126 00:08:26.620691 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.663779 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.663852 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.663870 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.663896 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.663915 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.767618 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.767688 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.767705 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.767731 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.767750 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.871443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.871507 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.871525 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.871551 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.871569 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.974618 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.974851 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.974878 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.974912 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:26 crc kubenswrapper[4874]: I0126 00:08:26.974937 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:26Z","lastTransitionTime":"2026-01-26T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.078838 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.078900 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.078918 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.078945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.078985 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.182744 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.182808 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.182836 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.182865 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.182887 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.285497 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.285563 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.285581 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.285610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.285628 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.341615 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.341890 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:31.341850637 +0000 UTC m=+146.075957204 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.388506 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.388561 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.388579 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.388604 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.388624 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.443597 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.443732 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.443792 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.443864 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.443864 4874 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444111 4874 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444116 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444169 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444191 4874 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444131 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444327 4874 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.444361 4874 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.445116 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:09:31.444059732 +0000 UTC m=+146.178166309 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.445222 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-26 00:09:31.445177268 +0000 UTC m=+146.179283875 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.445262 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-26 00:09:31.445243489 +0000 UTC m=+146.179350196 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.445315 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-26 00:09:31.445295081 +0000 UTC m=+146.179401788 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.491452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.491513 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.491537 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.491568 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.491596 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.594318 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.594382 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.594408 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.594441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.594463 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.609684 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:24:13.766582715 +0000 UTC Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.620316 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:27 crc kubenswrapper[4874]: E0126 00:08:27.620546 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.697513 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.697586 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.697604 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.697634 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.697659 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.801817 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.801920 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.801938 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.802023 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.802043 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.905453 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.905538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.905562 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.905596 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:27 crc kubenswrapper[4874]: I0126 00:08:27.905620 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:27Z","lastTransitionTime":"2026-01-26T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.008565 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.008703 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.008726 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.008755 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.009046 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.113023 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.113097 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.113115 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.113149 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.113174 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.216504 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.216568 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.216584 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.216610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.216630 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.320991 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.321084 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.321108 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.321137 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.321155 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.382863 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.382954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.383009 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.383046 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.383068 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.405528 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:28Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.410474 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.410507 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.410519 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.410535 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.410547 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.424550 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:28Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.428614 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.428665 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.428680 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.428698 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.428711 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.446849 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:28Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.455572 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.455622 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.455636 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.455656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.455669 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.474270 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:28Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.479326 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.479399 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.479423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.479452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.479474 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.499242 4874 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e73dcac0-015f-4575-8403-69ce0bd9d896\\\",\\\"systemUUID\\\":\\\"a8c485be-6bf0-44d9-8c05-f7322b3cb7a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:28Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.499391 4874 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.501546 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.501601 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.501624 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.501655 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.501680 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.604610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.604726 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.604749 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.604776 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.604794 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.610505 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:59:46.361723533 +0000 UTC Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.620190 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.620199 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.620259 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.620423 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.620517 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:28 crc kubenswrapper[4874]: E0126 00:08:28.620614 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.708436 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.708499 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.708519 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.708544 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.708562 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.812257 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.812335 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.812359 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.812394 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.812421 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.915186 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.915252 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.915270 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.915298 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:28 crc kubenswrapper[4874]: I0126 00:08:28.915319 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:28Z","lastTransitionTime":"2026-01-26T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.018557 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.018616 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.018633 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.018659 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.018679 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.121107 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.121197 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.121210 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.121251 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.121266 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.224285 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.224352 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.224370 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.224396 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.224420 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.329222 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.329277 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.329294 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.329320 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.329339 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.433027 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.433108 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.433130 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.433163 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.433186 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.537165 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.537227 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.537253 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.537283 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.537306 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.611421 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 01:55:17.407544181 +0000 UTC Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.620180 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:29 crc kubenswrapper[4874]: E0126 00:08:29.620409 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.640785 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.640889 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.640922 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.640955 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.641049 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.744675 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.744732 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.744750 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.744776 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.744792 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.847666 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.847730 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.847748 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.847773 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.847791 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.950432 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.950500 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.950524 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.950553 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:29 crc kubenswrapper[4874]: I0126 00:08:29.950578 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:29Z","lastTransitionTime":"2026-01-26T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.053624 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.053699 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.053723 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.053747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.053766 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.156133 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.156200 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.156226 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.156256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.156279 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.259275 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.259329 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.259342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.259361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.259374 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.362548 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.362620 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.362644 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.362677 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.362702 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.465371 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.465444 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.465482 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.465512 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.465534 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.569069 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.569125 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.569157 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.569203 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.569229 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.612507 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:47:14.081221043 +0000 UTC Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.620043 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.620171 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.620064 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:30 crc kubenswrapper[4874]: E0126 00:08:30.620306 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:30 crc kubenswrapper[4874]: E0126 00:08:30.620930 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:30 crc kubenswrapper[4874]: E0126 00:08:30.621173 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.621494 4874 scope.go:117] "RemoveContainer" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.672303 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.672381 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.672410 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.672441 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.672464 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.775091 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.775541 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.775559 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.775581 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.775599 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.879030 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.879099 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.879122 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.879152 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.879170 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.981777 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.981839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.981859 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.981884 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:30 crc kubenswrapper[4874]: I0126 00:08:30.981901 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:30Z","lastTransitionTime":"2026-01-26T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.084386 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.084457 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.084476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.084505 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.084527 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.188739 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.188788 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.188827 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.188844 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.188857 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.291901 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.292031 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.292058 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.292088 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.292114 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.394590 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.394656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.394670 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.394693 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.394706 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.498563 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.498608 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.498621 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.498638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.498652 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.602328 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.602396 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.602415 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.602443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.602466 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.613470 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 06:37:01.263190103 +0000 UTC Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.620040 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:31 crc kubenswrapper[4874]: E0126 00:08:31.620383 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.705777 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.705813 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.705826 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.705846 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.705858 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.808394 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.808440 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.808452 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.808472 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.808486 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.911705 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.911794 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.911818 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.911849 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:31 crc kubenswrapper[4874]: I0126 00:08:31.911872 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:31Z","lastTransitionTime":"2026-01-26T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.014443 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.014508 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.014528 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.014551 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.014564 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.117538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.117598 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.117616 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.117642 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.117662 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.147132 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/2.log" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.151884 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.152790 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.175348 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f
7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.193775 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"090de823-350a-4302-a4e8-9816de512911\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.215133 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.220650 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.220747 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.220766 4874 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.220792 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.220809 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.232648 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.258140 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.277127 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.298547 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.324000 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.324063 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.324088 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.324116 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.324136 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.336212 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.358249 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.376872 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.396805 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.410457 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.427024 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.427087 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.427104 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.427136 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.427154 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.429758 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.456456 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.475926 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.495638 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.514458 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.530153 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.530217 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.530238 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.530260 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.530281 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.546192 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.563852 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.614291 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 13:26:25.899982074 +0000 UTC Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.619719 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.619746 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:32 crc kubenswrapper[4874]: E0126 00:08:32.619896 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.619985 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:32 crc kubenswrapper[4874]: E0126 00:08:32.620133 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:32 crc kubenswrapper[4874]: E0126 00:08:32.620340 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.633677 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.633722 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.633738 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.633759 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.633775 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.736538 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.736654 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.736681 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.736832 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.736862 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.840841 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.840892 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.840907 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.840934 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.840951 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.944240 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.944296 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.944314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.944337 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:32 crc kubenswrapper[4874]: I0126 00:08:32.944357 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:32Z","lastTransitionTime":"2026-01-26T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.047466 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.047523 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.047540 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.047563 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.047580 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.150765 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.151068 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.151085 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.151104 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.151120 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.159152 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/3.log" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.160474 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/2.log" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.165393 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" exitCode=1 Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.165467 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.165529 4874 scope.go:117] "RemoveContainer" containerID="82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.167245 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:08:33 crc kubenswrapper[4874]: E0126 00:08:33.167732 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.187445 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.205776 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.224285 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.245750 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.254235 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.254272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.254286 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.254307 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.254325 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.262368 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"090de823-350a-4302-a4e8-9816de512911\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.279793 4874 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.300798 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.319666 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.336559 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.356638 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.356679 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.356690 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.356708 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.356721 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.366004 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.382463 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.417814 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.442721 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.459422 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.459456 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.459464 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.459479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.459489 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.464314 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0d509bc45f092e028831bf74f2fda9a42d01decee5a373ca23deb9f696f4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:03Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419416 6562 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:03.419506 6562 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:03.419877 6562 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:03.419908 6562 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:03.419951 6562 factory.go:656] Stopping watch factory\\\\nI0126 00:08:03.419988 6562 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:03.419996 6562 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:03.438269 6562 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:03.438295 6562 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:03.438376 6562 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:03.438407 6562 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:03.438511 6562 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:32Z\\\",\\\"message\\\":\\\"ory.go:140\\\\nI0126 00:08:32.096841 6965 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:32.096861 6965 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:32.096924 6965 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:32.097794 6965 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:32.097815 6965 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:32.104931 6965 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:32.105099 6965 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:32.105148 6965 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:32.105241 6965 factory.go:656] Stopping watch factory\\\\nI0126 00:08:32.105303 6965 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:32.105240 6965 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:32.105441 6965 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:32.105565 6965 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.478503 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.491096 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.507475 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539
ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.517242 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.527798 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:33Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.562701 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.562797 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.562815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.562841 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.562859 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.614444 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 20:46:12.871282623 +0000 UTC Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.620196 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:33 crc kubenswrapper[4874]: E0126 00:08:33.620456 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.667569 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.667648 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.667672 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.667703 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.667728 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.770418 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.770473 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.770490 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.770511 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.770528 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.873606 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.873672 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.873687 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.873710 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.873729 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.977945 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.978104 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.978124 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.978153 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 26 00:08:33 crc kubenswrapper[4874]: I0126 00:08:33.978175 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:33Z","lastTransitionTime":"2026-01-26T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.082116 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.082177 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.082190 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.082210 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.082221 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.172199 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/3.log"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.178279 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"
Jan 26 00:08:34 crc kubenswrapper[4874]: E0126 00:08:34.178671 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.184645 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.184695 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.184707 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.184723 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.184735 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.203800 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262b9da0-a082-4538-af3b-2c747e51e156\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-26T00:07:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0126 00:07:18.237624 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0126 00:07:18.241161 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3427775462/tls.crt::/tmp/serving-cert-3427775462/tls.key\\\\\\\"\\\\nI0126 00:07:23.465358 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0126 00:07:23.480833 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0126 00:07:23.480870 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0126 00:07:23.480907 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0126 00:07:23.480939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0126 00:07:23.488796 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0126 00:07:23.488839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0126 00:07:23.488850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0126 00:07:23.488850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0126 00:07:23.488859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0126 00:07:23.488871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0126 00:07:23.488879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0126 00:07:23.488888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0126 00:07:23.496678 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.222944 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cd095cb-3b94-4cc8-a0ec-c9d7e88e383e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72ff9d8587eab04da6597fbd32bedb2af3a6474a76c7b49030a44e3e4ccf890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb55268d8e5e28e4b63eef3a382b4c539dbd9bf2eff3a33666291d8f69c90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c172fe6ec8b85a775b7c81307a5b723fa69d97306b14d84da9c8f86e27f9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597c99a6d97e8263c514fa2d13d3c5a455b2af9132d51420a7c7246726e11eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.244782 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.263669 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1fd285d-2d8d-4a57-8afa-450025153e32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ba6184e96babf53991948429005aa011b12a579d8625e443ad37ca113d2c9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bh6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ffp42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.287735 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.287799 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.287816 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.287843 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.287861 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.296181 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79a8f534-7005-4a14-ae01-0a689a541502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:32Z\\\",\\\"message\\\":\\\"ory.go:140\\\\nI0126 00:08:32.096841 6965 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0126 00:08:32.096861 6965 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:32.096924 6965 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0126 00:08:32.097794 6965 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0126 00:08:32.097815 6965 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0126 00:08:32.104931 6965 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0126 00:08:32.105099 6965 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0126 00:08:32.105148 6965 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0126 00:08:32.105241 6965 factory.go:656] Stopping watch factory\\\\nI0126 00:08:32.105303 6965 ovnkube.go:599] Stopped ovnkube\\\\nI0126 00:08:32.105240 6965 handler.go:208] Removed *v1.Node event handler 2\\\\nI0126 00:08:32.105441 6965 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0126 00:08:32.105565 6965 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:08:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52hgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dvmzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.312415 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-m5wsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30567d80-b29a-4ecb-b39a-d97b81664dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c203c95440b0146888f221191743dde7947950f2fd4e2173e8782a1f3b03493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq7tn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-m5wsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.329675 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a83088b-51a6-4ed6-b4e9-5191f0017930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88c238ded677dbefc5bbac29c7ff71356fa3bc88bcfe45699798744dfd204f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508c3d5cdf4bb5e8e1a3955a0ba0a289b59a97b5c6d79619652007879bf11fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svcbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bdngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.350443 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c07
8a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.368098 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"090de823-350a-4302-a4e8-9816de512911\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.387862 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.390623 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.390681 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.390701 4874 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.390729 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.390747 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.402602 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.423737 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.440910 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.464227 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qjn7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.494834 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.494901 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.494918 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.494947 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.494999 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.500907 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0143dc3-eb0d-47fa-ac05-a1bc98e8cef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d946e54650a608f5e4240df4b928bd52a4bf94049c95fad80a0cdd72c492bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50c5f3a9ffce1e4476a4ed2478c51724884d62f865af7a2a4600ca3b212f7624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e728df45e18fd13390bc916b52a62e3e46c8ce83adfc32d494a9ec2c898a083b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20489bbac9206e736b713d2c570d9be9140975144978613ff6381638899cf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a746f23e56ced4abf213e4f0f6ec9d755ff4d6ae65dc2ee0cae2addc34e3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d0f42fcc6135ffcf0dd848e030ef6a5a88a2e3e3fffa1b5346522828e03ee6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c423bda5f6352da034da160a0efaa24057c2ba9585ce24dda25243b9892e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a2738e42be109e2f1a71bac8ac4de7e8d8100f0e3ced6d5f24ef20fee122b1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.521885 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d1edc671110f58f6ef875d42a500cf14a0a28ffc276ee5cc7a4856d6c254f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.542501 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.561719 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc416509b1cfb88f74a2c1e4ffdb9b7cabd0c8f1dde024905973578bc2f2d04b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.581335 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mmtqg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb749-294e-42b2-952e-c0f4b1611c06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76fdc9571114ddd23ae65889143f9d37550a65855dd1b027559df35dabfb778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2cz8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mmtqg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.597790 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.597850 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.597872 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.597898 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.597915 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.615149 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 04:34:22.90644636 +0000 UTC Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.619760 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.619871 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:34 crc kubenswrapper[4874]: E0126 00:08:34.620064 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.620095 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:34 crc kubenswrapper[4874]: E0126 00:08:34.620218 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:34 crc kubenswrapper[4874]: E0126 00:08:34.620500 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.701479 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.701544 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.701556 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.701578 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.701591 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.804862 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.804932 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.804955 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.805021 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.805043 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.908713 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.908782 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.908800 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.908824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:34 crc kubenswrapper[4874]: I0126 00:08:34.908842 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:34Z","lastTransitionTime":"2026-01-26T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.012749 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.012820 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.012839 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.012863 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.012881 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.115953 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.116050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.116067 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.116097 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.116115 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.219749 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.219815 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.219835 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.219861 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.219881 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.323146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.323207 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.323224 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.323251 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.323272 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.426146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.426221 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.426242 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.426272 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.426293 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.530148 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.530211 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.530229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.530256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.530278 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.615793 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 21:22:43.363294395 +0000 UTC Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.620295 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:35 crc kubenswrapper[4874]: E0126 00:08:35.620572 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.632868 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.632954 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.633019 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.633050 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.633072 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.637695 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"489242ac-41c4-45f2-9828-58c4012e9f64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vzrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qkq5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.659032 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0c28b66-cea5-44af-b122-081987a189fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf71d445c55f872227db3ac90b77d893cf2b28b860cdfffd34191279d2c133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c078a00492ddb613858f7ae0fcac24fbad00c69d046d1a189e332ccb45861ffa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4a0e7031e0d6e53e36be0a0f4e52a2aba5d1426382ff2fd13ad8bd3b423a9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.676433 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"090de823-350a-4302-a4e8-9816de512911\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3bc10cc7913878e1279288eb135810f2feac44069670c7b2226fa549ab5a45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de0cce962427fba11a1c0084fb8dcb2d3fef977023dca65a9d54685b2754e00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.697439 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8ab7c67129b12fbb807ce63dc410f20169f8950302f169e5df47d12d4e9d506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a25dafb1c12145edfe62b1c6055115dc8a7ac121dc01bc48fe9fe9f7222543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.717810 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.735679 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.735731 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.735764 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.735801 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.735826 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.743734 4874 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3d49871-449f-49f4-a40a-121286b1e929\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0191b75321c90f585f38df5582cea18dab6fa70c8822197f8758fc374334e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e383701536b3cf159c5b3b44e8b3ff5e1f769ace051840a17da2a177ba5cf198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://745d669824693906ac3a30dbbf571cea7f7b6fb74728a25426d98d4062637921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f340f8cc6ef6580a3dcbe87179bcf2043c7daeba0551b9d24bfd7996cfb181\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3e16ddbb03b140ee1af7ab5486285c88b92d293d5a3773ee8e1c0e2db5e105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af9c5208d94b73bba5acbccf3c7f57e2017d858e70180a446b5ca919b58886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774e020a4f817937b605fd428977fe55c81cb46056a022503a1c3122bb4af0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-26T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg5zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q4qrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.762350 4874 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-qjn7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23d0d511-4c29-434d-92b1-74ca6909e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-26T00:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-26T00:08:17Z\\\",\\\"message\\\":\\\"2026-01-26T00:07:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf\\\\n2026-01-26T00:07:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_33e034f7-5d71-4103-80e1-6b893a8abecf to /host/opt/cni/bin/\\\\n2026-01-26T00:07:32Z [verbose] multus-daemon started\\\\n2026-01-26T00:07:32Z [verbose] Readiness Indicator file check\\\\n2026-01-26T00:08:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-26T00:07:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-26T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqpcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-26T00:07:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qjn7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-26T00:08:35Z is after 2025-08-24T17:21:41Z" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.838599 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.839066 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.839357 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.839557 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.839694 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.842299 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=73.842262684 podStartE2EDuration="1m13.842262684s" podCreationTimestamp="2026-01-26 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.819821764 +0000 UTC m=+90.553928381" watchObservedRunningTime="2026-01-26 00:08:35.842262684 +0000 UTC m=+90.576369261" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.916710 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m5wsd" podStartSLOduration=66.916686122 podStartE2EDuration="1m6.916686122s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.916551459 +0000 UTC m=+90.650658036" watchObservedRunningTime="2026-01-26 00:08:35.916686122 +0000 UTC m=+90.650792669" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.917018 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mmtqg" podStartSLOduration=66.9170122 podStartE2EDuration="1m6.9170122s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.901613816 +0000 UTC m=+90.635720393" watchObservedRunningTime="2026-01-26 00:08:35.9170122 +0000 UTC m=+90.651118747" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.933203 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bdngr" podStartSLOduration=66.93317858099999 podStartE2EDuration="1m6.933178581s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.932732921 +0000 UTC m=+90.666839488" watchObservedRunningTime="2026-01-26 00:08:35.933178581 +0000 UTC m=+90.667285148" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.943217 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.943483 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.943613 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.943740 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.943884 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:35Z","lastTransitionTime":"2026-01-26T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.960306 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.960285572 podStartE2EDuration="1m12.960285572s" podCreationTimestamp="2026-01-26 00:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.959707028 +0000 UTC m=+90.693813585" watchObservedRunningTime="2026-01-26 00:08:35.960285572 +0000 UTC m=+90.694392149" Jan 26 00:08:35 crc kubenswrapper[4874]: I0126 00:08:35.990429 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.990406473 podStartE2EDuration="35.990406473s" podCreationTimestamp="2026-01-26 00:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:35.975130562 +0000 UTC m=+90.709237109" watchObservedRunningTime="2026-01-26 00:08:35.990406473 +0000 UTC m=+90.724513010" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.031251 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podStartSLOduration=67.031225237 podStartE2EDuration="1m7.031225237s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:36.003547523 +0000 UTC m=+90.737654090" watchObservedRunningTime="2026-01-26 00:08:36.031225237 +0000 UTC m=+90.765331784" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.052151 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.052223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.052240 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.052264 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.052281 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.154817 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.154874 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.154884 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.154902 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.154913 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.258314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.258376 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.258393 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.258419 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.258438 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.362247 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.362331 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.362353 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.362384 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.362493 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.465656 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.465709 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.465726 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.465751 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.465769 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.569150 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.569202 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.569223 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.569249 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.569269 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.616137 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:07:24.854775593 +0000 UTC Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.619542 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.619581 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.619586 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:36 crc kubenswrapper[4874]: E0126 00:08:36.619698 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:36 crc kubenswrapper[4874]: E0126 00:08:36.619786 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:36 crc kubenswrapper[4874]: E0126 00:08:36.619889 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.672779 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.672824 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.672842 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.672865 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.672882 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.776524 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.776587 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.776605 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.776631 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.776649 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.880375 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.880436 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.880453 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.880476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.880492 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.983389 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.983449 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.983476 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.983520 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:36 crc kubenswrapper[4874]: I0126 00:08:36.983543 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:36Z","lastTransitionTime":"2026-01-26T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.086361 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.086455 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.086473 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.086499 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.086517 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.188987 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.189084 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.189114 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.189149 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.189169 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.292180 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.292256 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.292273 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.292299 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.292317 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.395785 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.395854 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.395872 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.395896 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.395915 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.498845 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.498917 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.498935 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.498990 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.499010 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.602300 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.602397 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.602423 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.602462 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.602487 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.616615 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 23:06:00.2597361 +0000 UTC Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.620160 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:37 crc kubenswrapper[4874]: E0126 00:08:37.620398 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.705952 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.706044 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.706064 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.706089 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.706107 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.809342 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.809702 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.809721 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.809746 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.809764 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.913314 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.913410 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.913429 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.913459 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:37 crc kubenswrapper[4874]: I0126 00:08:37.913479 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:37Z","lastTransitionTime":"2026-01-26T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.017369 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.017446 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.017464 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.017492 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.017508 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.120521 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.120587 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.120610 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.120637 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.120659 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.223232 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.223316 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.223330 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.223371 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.223389 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.327076 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.327146 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.327172 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.327205 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.327231 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.430149 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.430229 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.430242 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.430260 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.430273 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.533790 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.533882 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.533914 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.533952 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.534017 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.570258 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.570327 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.570350 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.570378 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.570399 4874 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-26T00:08:38Z","lastTransitionTime":"2026-01-26T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.619445 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:38 crc kubenswrapper[4874]: E0126 00:08:38.619644 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.619691 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:38 crc kubenswrapper[4874]: E0126 00:08:38.620187 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.620184 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 03:26:15.616254559 +0000 UTC Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.620245 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.620456 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:38 crc kubenswrapper[4874]: E0126 00:08:38.620917 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.632660 4874 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.636606 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt"] Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.637366 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.639879 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.640195 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.640379 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.640507 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.655850 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.655817709 podStartE2EDuration="16.655817709s" podCreationTimestamp="2026-01-26 00:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:38.655620704 +0000 UTC m=+93.389727341" watchObservedRunningTime="2026-01-26 00:08:38.655817709 +0000 UTC m=+93.389924296" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.719589 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q4qrv" podStartSLOduration=69.719554804 podStartE2EDuration="1m9.719554804s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:38.718613352 +0000 UTC m=+93.452719969" watchObservedRunningTime="2026-01-26 00:08:38.719554804 +0000 UTC m=+93.453661371" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.756547 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=69.756522087 podStartE2EDuration="1m9.756522087s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:38.755440742 +0000 UTC m=+93.489547309" watchObservedRunningTime="2026-01-26 00:08:38.756522087 +0000 UTC m=+93.490628654" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.778511 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qjn7c" podStartSLOduration=69.778487476 podStartE2EDuration="1m9.778487476s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:38.778035515 +0000 UTC m=+93.512142092" watchObservedRunningTime="2026-01-26 00:08:38.778487476 +0000 UTC m=+93.512594023" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.793936 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: 
\"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.794075 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.794126 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e86ce1b-2ca0-4490-b117-7af8f0545f43-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.794224 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86ce1b-2ca0-4490-b117-7af8f0545f43-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.794257 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86ce1b-2ca0-4490-b117-7af8f0545f43-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.895832 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86ce1b-2ca0-4490-b117-7af8f0545f43-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.895890 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86ce1b-2ca0-4490-b117-7af8f0545f43-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.895937 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.896005 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: 
\"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.896029 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e86ce1b-2ca0-4490-b117-7af8f0545f43-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.896194 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.896234 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e86ce1b-2ca0-4490-b117-7af8f0545f43-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.897147 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e86ce1b-2ca0-4490-b117-7af8f0545f43-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.905208 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86ce1b-2ca0-4490-b117-7af8f0545f43-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.922158 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86ce1b-2ca0-4490-b117-7af8f0545f43-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qxwqt\" (UID: \"1e86ce1b-2ca0-4490-b117-7af8f0545f43\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: I0126 00:08:38.964291 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" Jan 26 00:08:38 crc kubenswrapper[4874]: W0126 00:08:38.989404 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e86ce1b_2ca0_4490_b117_7af8f0545f43.slice/crio-126d8f65198a59232fb6b8655c99ca68194606ec877b6bba62507758b4145117 WatchSource:0}: Error finding container 126d8f65198a59232fb6b8655c99ca68194606ec877b6bba62507758b4145117: Status 404 returned error can't find the container with id 126d8f65198a59232fb6b8655c99ca68194606ec877b6bba62507758b4145117 Jan 26 00:08:39 crc kubenswrapper[4874]: I0126 00:08:39.196700 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" event={"ID":"1e86ce1b-2ca0-4490-b117-7af8f0545f43","Type":"ContainerStarted","Data":"be7a7644ddb0432953417d4720504878820d5e255a278e372a8d2b7e7853d856"} Jan 26 00:08:39 crc kubenswrapper[4874]: I0126 00:08:39.197154 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" event={"ID":"1e86ce1b-2ca0-4490-b117-7af8f0545f43","Type":"ContainerStarted","Data":"126d8f65198a59232fb6b8655c99ca68194606ec877b6bba62507758b4145117"} Jan 26 00:08:39 crc kubenswrapper[4874]: I0126 00:08:39.619628 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:39 crc kubenswrapper[4874]: E0126 00:08:39.619881 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:40 crc kubenswrapper[4874]: I0126 00:08:40.620380 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:40 crc kubenswrapper[4874]: I0126 00:08:40.620448 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:40 crc kubenswrapper[4874]: I0126 00:08:40.620383 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:40 crc kubenswrapper[4874]: E0126 00:08:40.620549 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:40 crc kubenswrapper[4874]: E0126 00:08:40.620746 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:40 crc kubenswrapper[4874]: E0126 00:08:40.621120 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:41 crc kubenswrapper[4874]: I0126 00:08:41.619932 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:41 crc kubenswrapper[4874]: E0126 00:08:41.620320 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:42 crc kubenswrapper[4874]: I0126 00:08:42.619604 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:42 crc kubenswrapper[4874]: I0126 00:08:42.619608 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:42 crc kubenswrapper[4874]: E0126 00:08:42.620280 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:42 crc kubenswrapper[4874]: I0126 00:08:42.619683 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:42 crc kubenswrapper[4874]: E0126 00:08:42.620387 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:42 crc kubenswrapper[4874]: E0126 00:08:42.620590 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:43 crc kubenswrapper[4874]: I0126 00:08:43.619614 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:43 crc kubenswrapper[4874]: E0126 00:08:43.619844 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:44 crc kubenswrapper[4874]: I0126 00:08:44.619811 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:44 crc kubenswrapper[4874]: I0126 00:08:44.619861 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:44 crc kubenswrapper[4874]: I0126 00:08:44.619811 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:44 crc kubenswrapper[4874]: E0126 00:08:44.620023 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:44 crc kubenswrapper[4874]: E0126 00:08:44.620147 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:44 crc kubenswrapper[4874]: E0126 00:08:44.620256 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:45 crc kubenswrapper[4874]: I0126 00:08:45.619751 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:45 crc kubenswrapper[4874]: E0126 00:08:45.621853 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:46 crc kubenswrapper[4874]: I0126 00:08:46.619994 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:46 crc kubenswrapper[4874]: I0126 00:08:46.620084 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:46 crc kubenswrapper[4874]: E0126 00:08:46.620202 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:46 crc kubenswrapper[4874]: I0126 00:08:46.620241 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:46 crc kubenswrapper[4874]: E0126 00:08:46.620425 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:46 crc kubenswrapper[4874]: E0126 00:08:46.620644 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:47 crc kubenswrapper[4874]: I0126 00:08:47.620341 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:47 crc kubenswrapper[4874]: E0126 00:08:47.620559 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:47 crc kubenswrapper[4874]: I0126 00:08:47.621702 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:08:47 crc kubenswrapper[4874]: E0126 00:08:47.622007 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:08:48 crc kubenswrapper[4874]: I0126 00:08:48.207156 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:48 crc kubenswrapper[4874]: E0126 00:08:48.207369 4874 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:48 crc kubenswrapper[4874]: E0126 00:08:48.207482 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs podName:489242ac-41c4-45f2-9828-58c4012e9f64 nodeName:}" failed. No retries permitted until 2026-01-26 00:09:52.207444333 +0000 UTC m=+166.941550900 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs") pod "network-metrics-daemon-qkq5t" (UID: "489242ac-41c4-45f2-9828-58c4012e9f64") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 26 00:08:48 crc kubenswrapper[4874]: I0126 00:08:48.619554 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:48 crc kubenswrapper[4874]: I0126 00:08:48.619699 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:48 crc kubenswrapper[4874]: E0126 00:08:48.619759 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:48 crc kubenswrapper[4874]: I0126 00:08:48.619555 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:48 crc kubenswrapper[4874]: E0126 00:08:48.619890 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:48 crc kubenswrapper[4874]: E0126 00:08:48.620094 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:49 crc kubenswrapper[4874]: I0126 00:08:49.620242 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:49 crc kubenswrapper[4874]: E0126 00:08:49.620485 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:50 crc kubenswrapper[4874]: I0126 00:08:50.619427 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:50 crc kubenswrapper[4874]: I0126 00:08:50.619551 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:50 crc kubenswrapper[4874]: I0126 00:08:50.619552 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:50 crc kubenswrapper[4874]: E0126 00:08:50.619695 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:50 crc kubenswrapper[4874]: E0126 00:08:50.619865 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:50 crc kubenswrapper[4874]: E0126 00:08:50.620301 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:51 crc kubenswrapper[4874]: I0126 00:08:51.619835 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:51 crc kubenswrapper[4874]: E0126 00:08:51.620474 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:52 crc kubenswrapper[4874]: I0126 00:08:52.620354 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:52 crc kubenswrapper[4874]: E0126 00:08:52.620535 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:52 crc kubenswrapper[4874]: I0126 00:08:52.620376 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:52 crc kubenswrapper[4874]: E0126 00:08:52.620657 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:52 crc kubenswrapper[4874]: I0126 00:08:52.620359 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:52 crc kubenswrapper[4874]: E0126 00:08:52.620758 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:53 crc kubenswrapper[4874]: I0126 00:08:53.620180 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:53 crc kubenswrapper[4874]: E0126 00:08:53.620379 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:54 crc kubenswrapper[4874]: I0126 00:08:54.619907 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:54 crc kubenswrapper[4874]: I0126 00:08:54.620004 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:54 crc kubenswrapper[4874]: I0126 00:08:54.620099 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:54 crc kubenswrapper[4874]: E0126 00:08:54.620144 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:54 crc kubenswrapper[4874]: E0126 00:08:54.620244 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:54 crc kubenswrapper[4874]: E0126 00:08:54.620362 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:55 crc kubenswrapper[4874]: I0126 00:08:55.619823 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:55 crc kubenswrapper[4874]: E0126 00:08:55.622268 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:56 crc kubenswrapper[4874]: I0126 00:08:56.620200 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:56 crc kubenswrapper[4874]: I0126 00:08:56.620216 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:56 crc kubenswrapper[4874]: I0126 00:08:56.620364 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:56 crc kubenswrapper[4874]: E0126 00:08:56.620519 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:56 crc kubenswrapper[4874]: E0126 00:08:56.620778 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:56 crc kubenswrapper[4874]: E0126 00:08:56.620909 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:57 crc kubenswrapper[4874]: I0126 00:08:57.619645 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:57 crc kubenswrapper[4874]: E0126 00:08:57.619880 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:08:58 crc kubenswrapper[4874]: I0126 00:08:58.620203 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:08:58 crc kubenswrapper[4874]: E0126 00:08:58.620391 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:08:58 crc kubenswrapper[4874]: I0126 00:08:58.620787 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:08:58 crc kubenswrapper[4874]: E0126 00:08:58.620918 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:08:58 crc kubenswrapper[4874]: I0126 00:08:58.621255 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:08:58 crc kubenswrapper[4874]: E0126 00:08:58.621383 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:08:59 crc kubenswrapper[4874]: I0126 00:08:59.620060 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:08:59 crc kubenswrapper[4874]: E0126 00:08:59.620629 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:00 crc kubenswrapper[4874]: I0126 00:09:00.619860 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:00 crc kubenswrapper[4874]: I0126 00:09:00.619896 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:00 crc kubenswrapper[4874]: I0126 00:09:00.620433 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:00 crc kubenswrapper[4874]: E0126 00:09:00.620567 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:00 crc kubenswrapper[4874]: E0126 00:09:00.620683 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:00 crc kubenswrapper[4874]: E0126 00:09:00.620789 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:00 crc kubenswrapper[4874]: I0126 00:09:00.620943 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:09:00 crc kubenswrapper[4874]: E0126 00:09:00.621230 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dvmzv_openshift-ovn-kubernetes(79a8f534-7005-4a14-ae01-0a689a541502)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" Jan 26 00:09:01 crc kubenswrapper[4874]: I0126 00:09:01.620226 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:01 crc kubenswrapper[4874]: E0126 00:09:01.620468 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:02 crc kubenswrapper[4874]: I0126 00:09:02.619940 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:02 crc kubenswrapper[4874]: I0126 00:09:02.620050 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:02 crc kubenswrapper[4874]: I0126 00:09:02.620097 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:02 crc kubenswrapper[4874]: E0126 00:09:02.620268 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:02 crc kubenswrapper[4874]: E0126 00:09:02.620399 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:02 crc kubenswrapper[4874]: E0126 00:09:02.620533 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:03 crc kubenswrapper[4874]: I0126 00:09:03.620332 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:03 crc kubenswrapper[4874]: E0126 00:09:03.620571 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.302223 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/1.log" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.302954 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/0.log" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.303053 4874 generic.go:334] "Generic (PLEG): container finished" podID="23d0d511-4c29-434d-92b1-74ca6909e53f" containerID="5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588" exitCode=1 Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.303097 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerDied","Data":"5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588"} Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.303148 4874 scope.go:117] "RemoveContainer" containerID="78c9762931a0582c11ed1837ae5ebf281d62d9f30325864a643a91833a4c8154" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.303821 4874 scope.go:117] "RemoveContainer" containerID="5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588" Jan 26 00:09:04 crc kubenswrapper[4874]: E0126 00:09:04.304136 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qjn7c_openshift-multus(23d0d511-4c29-434d-92b1-74ca6909e53f)\"" pod="openshift-multus/multus-qjn7c" podUID="23d0d511-4c29-434d-92b1-74ca6909e53f" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.364820 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qxwqt" podStartSLOduration=95.364799951 podStartE2EDuration="1m35.364799951s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:08:39.216932992 +0000 UTC m=+93.951039599" watchObservedRunningTime="2026-01-26 00:09:04.364799951 +0000 UTC m=+119.098906498" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.619671 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.619681 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:04 crc kubenswrapper[4874]: E0126 00:09:04.619897 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:04 crc kubenswrapper[4874]: I0126 00:09:04.619681 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:04 crc kubenswrapper[4874]: E0126 00:09:04.620109 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:04 crc kubenswrapper[4874]: E0126 00:09:04.620149 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:05 crc kubenswrapper[4874]: I0126 00:09:05.309047 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/1.log" Jan 26 00:09:05 crc kubenswrapper[4874]: E0126 00:09:05.611381 4874 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 26 00:09:05 crc kubenswrapper[4874]: I0126 00:09:05.620205 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:05 crc kubenswrapper[4874]: E0126 00:09:05.622196 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:05 crc kubenswrapper[4874]: E0126 00:09:05.724524 4874 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 26 00:09:06 crc kubenswrapper[4874]: I0126 00:09:06.620400 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:06 crc kubenswrapper[4874]: I0126 00:09:06.620400 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:06 crc kubenswrapper[4874]: I0126 00:09:06.620598 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:06 crc kubenswrapper[4874]: E0126 00:09:06.620819 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:06 crc kubenswrapper[4874]: E0126 00:09:06.621476 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:06 crc kubenswrapper[4874]: E0126 00:09:06.621320 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:07 crc kubenswrapper[4874]: I0126 00:09:07.620585 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:07 crc kubenswrapper[4874]: E0126 00:09:07.620823 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:08 crc kubenswrapper[4874]: I0126 00:09:08.619678 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:08 crc kubenswrapper[4874]: I0126 00:09:08.619740 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:08 crc kubenswrapper[4874]: I0126 00:09:08.619755 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:08 crc kubenswrapper[4874]: E0126 00:09:08.619933 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:08 crc kubenswrapper[4874]: E0126 00:09:08.620130 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:08 crc kubenswrapper[4874]: E0126 00:09:08.620246 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:09 crc kubenswrapper[4874]: I0126 00:09:09.620517 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:09 crc kubenswrapper[4874]: E0126 00:09:09.620778 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:10 crc kubenswrapper[4874]: I0126 00:09:10.620073 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:10 crc kubenswrapper[4874]: E0126 00:09:10.620233 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:10 crc kubenswrapper[4874]: I0126 00:09:10.620431 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:10 crc kubenswrapper[4874]: E0126 00:09:10.620708 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:10 crc kubenswrapper[4874]: I0126 00:09:10.621560 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:10 crc kubenswrapper[4874]: E0126 00:09:10.621938 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:10 crc kubenswrapper[4874]: E0126 00:09:10.726280 4874 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 26 00:09:11 crc kubenswrapper[4874]: I0126 00:09:11.619632 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:11 crc kubenswrapper[4874]: E0126 00:09:11.619870 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:12 crc kubenswrapper[4874]: I0126 00:09:12.619940 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:12 crc kubenswrapper[4874]: I0126 00:09:12.619940 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:12 crc kubenswrapper[4874]: E0126 00:09:12.620583 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:12 crc kubenswrapper[4874]: I0126 00:09:12.620083 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:12 crc kubenswrapper[4874]: E0126 00:09:12.620776 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:12 crc kubenswrapper[4874]: E0126 00:09:12.620939 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:13 crc kubenswrapper[4874]: I0126 00:09:13.619572 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:13 crc kubenswrapper[4874]: E0126 00:09:13.619854 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:14 crc kubenswrapper[4874]: I0126 00:09:14.620461 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:14 crc kubenswrapper[4874]: I0126 00:09:14.620537 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:14 crc kubenswrapper[4874]: I0126 00:09:14.620466 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:14 crc kubenswrapper[4874]: E0126 00:09:14.621260 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:14 crc kubenswrapper[4874]: E0126 00:09:14.621659 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:14 crc kubenswrapper[4874]: E0126 00:09:14.621463 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:14 crc kubenswrapper[4874]: I0126 00:09:14.622035 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.350007 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/3.log" Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.353521 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerStarted","Data":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.354252 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.391225 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podStartSLOduration=106.391190532 podStartE2EDuration="1m46.391190532s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:15.386311627 +0000 UTC m=+130.120418204" watchObservedRunningTime="2026-01-26 00:09:15.391190532 +0000 UTC m=+130.125297109" Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.610139 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qkq5t"] Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.610289 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:15 crc kubenswrapper[4874]: E0126 00:09:15.610391 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:15 crc kubenswrapper[4874]: I0126 00:09:15.621918 4874 scope.go:117] "RemoveContainer" containerID="5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588" Jan 26 00:09:15 crc kubenswrapper[4874]: E0126 00:09:15.728487 4874 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 26 00:09:16 crc kubenswrapper[4874]: I0126 00:09:16.359267 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/1.log" Jan 26 00:09:16 crc kubenswrapper[4874]: I0126 00:09:16.359393 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerStarted","Data":"8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f"} Jan 26 00:09:16 crc kubenswrapper[4874]: I0126 00:09:16.619657 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:16 crc kubenswrapper[4874]: I0126 00:09:16.619726 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:16 crc kubenswrapper[4874]: I0126 00:09:16.619751 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:16 crc kubenswrapper[4874]: E0126 00:09:16.619822 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:16 crc kubenswrapper[4874]: E0126 00:09:16.619995 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:16 crc kubenswrapper[4874]: E0126 00:09:16.620056 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:17 crc kubenswrapper[4874]: I0126 00:09:17.619901 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:17 crc kubenswrapper[4874]: E0126 00:09:17.620132 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:18 crc kubenswrapper[4874]: I0126 00:09:18.619650 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:18 crc kubenswrapper[4874]: I0126 00:09:18.619742 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:18 crc kubenswrapper[4874]: I0126 00:09:18.619834 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:18 crc kubenswrapper[4874]: E0126 00:09:18.619831 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:18 crc kubenswrapper[4874]: E0126 00:09:18.620050 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:18 crc kubenswrapper[4874]: E0126 00:09:18.620489 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:19 crc kubenswrapper[4874]: I0126 00:09:19.620265 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:19 crc kubenswrapper[4874]: E0126 00:09:19.620456 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkq5t" podUID="489242ac-41c4-45f2-9828-58c4012e9f64" Jan 26 00:09:20 crc kubenswrapper[4874]: I0126 00:09:20.620462 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:20 crc kubenswrapper[4874]: I0126 00:09:20.620559 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:20 crc kubenswrapper[4874]: E0126 00:09:20.620680 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 26 00:09:20 crc kubenswrapper[4874]: I0126 00:09:20.621224 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:20 crc kubenswrapper[4874]: E0126 00:09:20.621256 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 26 00:09:20 crc kubenswrapper[4874]: E0126 00:09:20.621442 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 26 00:09:21 crc kubenswrapper[4874]: I0126 00:09:21.619847 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:21 crc kubenswrapper[4874]: I0126 00:09:21.625847 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 26 00:09:21 crc kubenswrapper[4874]: I0126 00:09:21.626155 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.620362 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.620464 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.620384 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.623887 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.627377 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.627450 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 26 00:09:22 crc kubenswrapper[4874]: I0126 00:09:22.630175 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.738857 4874 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.788289 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d56td"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.790445 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.793172 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.793804 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.795450 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.797120 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m97d7"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.797562 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.799724 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.800619 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.802326 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.804871 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.806691 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.806870 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.806976 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.807209 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.807422 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.807901 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.808740 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7wd7t"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.809333 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.810140 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.811341 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.812242 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.814005 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.814531 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.814802 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.814902 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.815428 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.815531 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.816190 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.817074 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.817201 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.823813 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.823866 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.824132 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.824659 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.825087 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.825560 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.825836 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8k592"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.826054 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.826609 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.826928 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.828373 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.828711 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.828879 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.829131 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.829254 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.829447 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.829534 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.829615 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 26 00:09:29 crc 
kubenswrapper[4874]: I0126 00:09:29.829459 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.830000 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.834824 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.834942 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.834956 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.835172 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.835196 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.835551 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.835875 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.836253 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.836703 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.836834 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.837046 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.837105 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.837225 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.837303 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.837332 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.838637 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.838841 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 26 
00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.841042 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29489760-gbhtf"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.842014 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.842417 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.842828 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843057 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843162 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843371 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843466 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843570 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843638 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843707 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843812 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843891 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.844040 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.843425 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.844141 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.844319 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.847715 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.857203 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Jan 26 00:09:29 crc kubenswrapper[4874]: 
I0126 00:09:29.859839 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.860678 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863534 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863590 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863622 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xqz88"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863662 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-encryption-config\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863696 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863722 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863780 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gtb7\" (UniqueName: \"kubernetes.io/projected/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-kube-api-access-2gtb7\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863806 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit\") pod \"apiserver-76f77b778f-d56td\" (UID: 
\"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863900 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.863944 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864055 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntmq\" (UniqueName: \"kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864088 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-serving-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864165 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ltk\" (UniqueName: \"kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864200 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864226 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-service-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864248 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-client\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " 
pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864273 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864313 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit-dir\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864340 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864470 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-image-import-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864495 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsjg\" (UniqueName: \"kubernetes.io/projected/7adc2094-bd76-49f1-a373-5cbe2212ec76-kube-api-access-zcsjg\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864550 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-serving-cert\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864585 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-images\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864613 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/83681e9d-cef1-4622-a31a-78d0db757f03-machine-approver-tls\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864647 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8863b23d-8bed-4a8c-af20-ce709512fc8b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864677 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzc9\" (UniqueName: \"kubernetes.io/projected/8077625e-f489-46b7-9b01-66a13c82f649-kube-api-access-kzzc9\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864735 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864768 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864797 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-client\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864820 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-dir\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864858 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-node-pullsecrets\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864887 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8077625e-f489-46b7-9b01-66a13c82f649-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864913 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-config\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864940 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppvh\" (UniqueName: \"kubernetes.io/projected/83681e9d-cef1-4622-a31a-78d0db757f03-kube-api-access-nppvh\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.864984 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8863b23d-8bed-4a8c-af20-ce709512fc8b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865016 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qj5j\" (UniqueName: \"kubernetes.io/projected/8863b23d-8bed-4a8c-af20-ce709512fc8b-kube-api-access-8qj5j\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865112 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865155 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865313 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-policies\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865343 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-config\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865573 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-auth-proxy-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865597 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2l97\" (UniqueName: \"kubernetes.io/projected/ff24fe21-0695-45eb-a15b-150aa14b1a58-kube-api-access-r2l97\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865693 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-serving-cert\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865734 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865727 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-serving-cert\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.865884 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-encryption-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.879111 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.879519 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.880478 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.880706 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.880824 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lzswr"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.883123 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.884022 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc 
kubenswrapper[4874]: I0126 00:09:29.885381 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-s6njc"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.885624 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ks9sj"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.886030 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.886853 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.886853 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.887060 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.888622 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.890981 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.891212 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.891307 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.892089 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.892095 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.892778 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.893340 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mcbjn"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.893530 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.893602 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.893845 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.893992 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894292 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894442 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894674 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894765 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894860 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.894920 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.895002 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.895517 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.895546 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.895720 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.895993 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896075 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896182 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896206 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896260 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896379 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.896409 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.899648 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.900397 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.900554 4874 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.900799 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.900821 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.900915 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.901272 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.903926 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.905582 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.905973 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.908165 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.908258 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tmxht"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.908679 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.909041 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.917681 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m97d7"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.917865 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.920259 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.924990 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.925231 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.929905 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ndg7w"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.932179 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7wd7t"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.932457 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.960432 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.961022 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.966786 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.967433 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968293 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968329 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-service-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968356 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-client\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968378 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968399 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit-dir\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 
00:09:29.968423 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-image-import-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968441 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcsjg\" (UniqueName: \"kubernetes.io/projected/7adc2094-bd76-49f1-a373-5cbe2212ec76-kube-api-access-zcsjg\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968527 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968553 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4kr\" (UniqueName: \"kubernetes.io/projected/92ad92d9-cb0d-4c2a-a19d-e627164e297a-kube-api-access-cx4kr\") pod \"downloads-7954f5f757-ks9sj\" (UID: \"92ad92d9-cb0d-4c2a-a19d-e627164e297a\") " pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968579 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968599 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-serving-cert\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968626 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e058cc-ca8f-4856-95d1-37b33389f471-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968657 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/83681e9d-cef1-4622-a31a-78d0db757f03-machine-approver-tls\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968656 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d"] Jan 26 00:09:29 crc 
kubenswrapper[4874]: I0126 00:09:29.968681 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-images\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968704 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzc9\" (UniqueName: \"kubernetes.io/projected/8077625e-f489-46b7-9b01-66a13c82f649-kube-api-access-kzzc9\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968723 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8863b23d-8bed-4a8c-af20-ce709512fc8b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968746 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968769 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968790 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptwt\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-kube-api-access-fptwt\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968819 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-serving-cert\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968842 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968883 4874 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-dir\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968905 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-node-pullsecrets\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968927 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8077625e-f489-46b7-9b01-66a13c82f649-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968948 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-client\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.968987 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-config\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969007 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-service-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969026 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4lt\" (UniqueName: \"kubernetes.io/projected/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-kube-api-access-xf4lt\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969046 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppvh\" (UniqueName: \"kubernetes.io/projected/83681e9d-cef1-4622-a31a-78d0db757f03-kube-api-access-nppvh\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969069 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8863b23d-8bed-4a8c-af20-ce709512fc8b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969090 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-stats-auth\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969109 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qj5j\" (UniqueName: \"kubernetes.io/projected/8863b23d-8bed-4a8c-af20-ce709512fc8b-kube-api-access-8qj5j\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969147 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa444927-3d15-42ba-b2aa-0ea838d8726a-service-ca-bundle\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969182 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969226 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969254 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e058cc-ca8f-4856-95d1-37b33389f471-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969266 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969288 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mkq\" (UniqueName: \"kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969322 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-config\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969354 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-policies\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969376 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-auth-proxy-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969401 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb7fefa5-3d99-467d-b674-a013b8d2374f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969427 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2l97\" (UniqueName: \"kubernetes.io/projected/ff24fe21-0695-45eb-a15b-150aa14b1a58-kube-api-access-r2l97\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969449 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-serving-cert\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969473 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-serving-cert\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: 
I0126 00:09:29.969499 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-encryption-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969522 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969551 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85zc\" (UniqueName: \"kubernetes.io/projected/aa444927-3d15-42ba-b2aa-0ea838d8726a-kube-api-access-x85zc\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969574 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969597 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb7fefa5-3d99-467d-b674-a013b8d2374f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969623 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969644 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-encryption-config\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969666 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969691 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gtb7\" (UniqueName: 
\"kubernetes.io/projected/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-kube-api-access-2gtb7\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969711 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969731 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969752 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969784 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969807 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntmq\" (UniqueName: \"kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969828 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-metrics-certs\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969847 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-client\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969870 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-serving-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" 
Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969892 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-default-certificate\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969913 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrqp\" (UniqueName: \"kubernetes.io/projected/6a6a602f-63bf-44ed-bb17-f1965f4be17b-kube-api-access-cjrqp\") pod \"migrator-59844c95c7-zd6c2\" (UID: \"6a6a602f-63bf-44ed-bb17-f1965f4be17b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969932 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-config\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.969986 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/d3e058cc-ca8f-4856-95d1-37b33389f471-kube-api-access-tdjkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.970014 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ltk\" (UniqueName: \"kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.970544 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb"] Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.970921 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.971092 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.972549 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-service-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.975321 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit-dir\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.975330 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.976373 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-image-import-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.976696 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.977847 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.979186 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-config\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.986434 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.987114 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-policies\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc 
kubenswrapper[4874]: I0126 00:09:29.987225 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.987801 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-auth-proxy-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.988368 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-images\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.988407 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-audit\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.989036 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.989039 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.989798 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8863b23d-8bed-4a8c-af20-ce709512fc8b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.995716 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff24fe21-0695-45eb-a15b-150aa14b1a58-audit-dir\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.995785 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83681e9d-cef1-4622-a31a-78d0db757f03-config\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.996253 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.996348 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-serving-ca\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.996500 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adc2094-bd76-49f1-a373-5cbe2212ec76-node-pullsecrets\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.997132 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adc2094-bd76-49f1-a373-5cbe2212ec76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.997943 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.998017 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8077625e-f489-46b7-9b01-66a13c82f649-config\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.998425 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:29 crc kubenswrapper[4874]: I0126 00:09:29.998577 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.000859 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.001196 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.001894 
4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.002344 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.002843 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.003428 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.003665 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.008521 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-etcd-client\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.009708 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8077625e-f489-46b7-9b01-66a13c82f649-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.011050 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.011590 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.012615 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-serving-cert\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.012752 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-serving-cert\") pod \"authentication-operator-69f744f599-m97d7\" (UID: \"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.012815 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8863b23d-8bed-4a8c-af20-ce709512fc8b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.012818 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff24fe21-0695-45eb-a15b-150aa14b1a58-encryption-config\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.013117 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-etcd-client\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.013312 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/83681e9d-cef1-4622-a31a-78d0db757f03-machine-approver-tls\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.013991 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.015818 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.015866 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.016466 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs"] Jan 26 00:09:30 crc 
kubenswrapper[4874]: I0126 00:09:30.016736 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.016827 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.016825 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-encryption-config\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.017088 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.017608 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.018059 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.018594 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adc2094-bd76-49f1-a373-5cbe2212ec76-serving-cert\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.018707 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.020569 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p65x9"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.021326 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.021830 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.024394 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.025349 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.027861 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.029680 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d56td"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.030949 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x7ct8"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.031588 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.033387 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.033709 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.033736 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.034541 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.034633 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.036314 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mcbjn"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.038601 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.043270 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.043333 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.043344 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.044880 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.047455 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xqz88"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.047488 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lzswr"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.049637 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8k592"] Jan 26 00:09:30 crc 
kubenswrapper[4874]: I0126 00:09:30.053452 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.053693 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29489760-gbhtf"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.057595 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.059235 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ks9sj"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.063038 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s6njc"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.066817 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.070935 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e058cc-ca8f-4856-95d1-37b33389f471-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071179 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071210 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptwt\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-kube-api-access-fptwt\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071226 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-serving-cert\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071249 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-service-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071266 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4lt\" (UniqueName: \"kubernetes.io/projected/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-kube-api-access-xf4lt\") pod \"etcd-operator-b45778765-lzswr\" (UID: 
\"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071292 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-stats-auth\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071319 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa444927-3d15-42ba-b2aa-0ea838d8726a-service-ca-bundle\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071358 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e058cc-ca8f-4856-95d1-37b33389f471-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071581 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mkq\" (UniqueName: \"kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071629 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb7fefa5-3d99-467d-b674-a013b8d2374f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071650 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071677 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x85zc\" (UniqueName: \"kubernetes.io/projected/aa444927-3d15-42ba-b2aa-0ea838d8726a-kube-api-access-x85zc\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071695 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb7fefa5-3d99-467d-b674-a013b8d2374f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 
00:09:30.071740 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-metrics-certs\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071757 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-client\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071776 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-default-certificate\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071800 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrqp\" (UniqueName: \"kubernetes.io/projected/6a6a602f-63bf-44ed-bb17-f1965f4be17b-kube-api-access-cjrqp\") pod \"migrator-59844c95c7-zd6c2\" (UID: \"6a6a602f-63bf-44ed-bb17-f1965f4be17b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071819 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-config\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071834 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/d3e058cc-ca8f-4856-95d1-37b33389f471-kube-api-access-tdjkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071864 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.071880 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4kr\" (UniqueName: \"kubernetes.io/projected/92ad92d9-cb0d-4c2a-a19d-e627164e297a-kube-api-access-cx4kr\") pod \"downloads-7954f5f757-ks9sj\" (UID: \"92ad92d9-cb0d-4c2a-a19d-e627164e297a\") " pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.074138 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-config\") pod \"etcd-operator-b45778765-lzswr\" (UID: 
\"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.075650 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.076136 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-client\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.076765 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-etcd-service-ca\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.076763 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.077086 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.079475 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.079913 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.083007 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-serving-cert\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.091065 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.093035 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.095598 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4rsrb"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.096732 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.098788 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7t65"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.099867 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.101123 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.103456 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.107321 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.109162 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rsrb"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.112473 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.112635 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.114629 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.116421 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.117576 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p65x9"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.122625 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ndg7w"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.124773 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7t65"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.125184 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.127089 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.127205 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.130015 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wp989"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.131262 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.131310 4874 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wp989"] Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.131517 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wp989" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.136197 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.153734 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.173015 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.193426 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.220160 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.225069 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb7fefa5-3d99-467d-b674-a013b8d2374f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.232695 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.252484 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.273227 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.292756 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.313874 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.332914 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.336319 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb7fefa5-3d99-467d-b674-a013b8d2374f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.353758 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.372944 4874 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.392911 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.397883 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-default-certificate\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.412526 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.424895 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-stats-auth\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.432743 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.437345 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa444927-3d15-42ba-b2aa-0ea838d8726a-metrics-certs\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.452787 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.462738 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa444927-3d15-42ba-b2aa-0ea838d8726a-service-ca-bundle\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.472612 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.493177 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.513892 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.534308 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.564283 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.573633 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.593357 4874 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.613379 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.632674 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.649229 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e058cc-ca8f-4856-95d1-37b33389f471-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.653098 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.662337 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e058cc-ca8f-4856-95d1-37b33389f471-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.673098 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.714011 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.754099 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.773693 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.793247 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.813756 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.833556 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.853619 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.875089 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.895060 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 26 00:09:30 crc 
kubenswrapper[4874]: I0126 00:09:30.914051 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.950302 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ltk\" (UniqueName: \"kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk\") pod \"route-controller-manager-6576b87f9c-9rrm6\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.953450 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.971830 4874 request.go:700] Waited for 1.000497727s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.974288 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 26 00:09:30 crc kubenswrapper[4874]: I0126 00:09:30.993767 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.037145 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsjg\" (UniqueName: \"kubernetes.io/projected/7adc2094-bd76-49f1-a373-5cbe2212ec76-kube-api-access-zcsjg\") pod \"apiserver-76f77b778f-d56td\" (UID: \"7adc2094-bd76-49f1-a373-5cbe2212ec76\") " pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.048675 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2l97\" (UniqueName: \"kubernetes.io/projected/ff24fe21-0695-45eb-a15b-150aa14b1a58-kube-api-access-r2l97\") pod \"apiserver-7bbb656c7d-z82bh\" (UID: \"ff24fe21-0695-45eb-a15b-150aa14b1a58\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.068884 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntmq\" (UniqueName: \"kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq\") pod \"controller-manager-879f6c89f-pns4h\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.090365 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzc9\" (UniqueName: \"kubernetes.io/projected/8077625e-f489-46b7-9b01-66a13c82f649-kube-api-access-kzzc9\") pod \"machine-api-operator-5694c8668f-7wd7t\" (UID: \"8077625e-f489-46b7-9b01-66a13c82f649\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.109619 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gtb7\" (UniqueName: \"kubernetes.io/projected/4d5e251e-34cd-4620-90fc-0683fa3ed2ef-kube-api-access-2gtb7\") pod \"authentication-operator-69f744f599-m97d7\" (UID: 
\"4d5e251e-34cd-4620-90fc-0683fa3ed2ef\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.127708 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qj5j\" (UniqueName: \"kubernetes.io/projected/8863b23d-8bed-4a8c-af20-ce709512fc8b-kube-api-access-8qj5j\") pod \"openshift-apiserver-operator-796bbdcf4f-zdmd8\" (UID: \"8863b23d-8bed-4a8c-af20-ce709512fc8b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.139001 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.150749 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppvh\" (UniqueName: \"kubernetes.io/projected/83681e9d-cef1-4622-a31a-78d0db757f03-kube-api-access-nppvh\") pod \"machine-approver-56656f9798-pcfbh\" (UID: \"83681e9d-cef1-4622-a31a-78d0db757f03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.153891 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.165732 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.173192 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.181610 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.194021 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.222815 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.234206 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.245302 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.255196 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.274530 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.297346 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.313482 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.329684 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.333403 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.351352 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.351430 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7wd7t"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.352942 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 26 00:09:31 crc kubenswrapper[4874]: W0126 00:09:31.366283 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8077625e_f489_46b7_9b01_66a13c82f649.slice/crio-367f35e415b5498a17912572c58b183deb2e485d344b37da8ddb67f0564aacfd WatchSource:0}: Error finding container 367f35e415b5498a17912572c58b183deb2e485d344b37da8ddb67f0564aacfd: Status 404 returned error can't find the container with id 367f35e415b5498a17912572c58b183deb2e485d344b37da8ddb67f0564aacfd Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.370827 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.373326 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.382546 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.387535 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:31 crc kubenswrapper[4874]: E0126 00:09:31.387756 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:11:33.387701614 +0000 UTC m=+268.121808151 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:31 crc kubenswrapper[4874]: W0126 00:09:31.391487 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8863b23d_8bed_4a8c_af20_ce709512fc8b.slice/crio-fc9599851a3d8584806bf305c8c094ed591b88a6ed97672c48c6dd4a0ebf14c8 WatchSource:0}: Error finding container fc9599851a3d8584806bf305c8c094ed591b88a6ed97672c48c6dd4a0ebf14c8: Status 404 returned error can't find the container with id fc9599851a3d8584806bf305c8c094ed591b88a6ed97672c48c6dd4a0ebf14c8 Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.393202 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.407192 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.415149 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.426187 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.429456 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" event={"ID":"8863b23d-8bed-4a8c-af20-ce709512fc8b","Type":"ContainerStarted","Data":"fc9599851a3d8584806bf305c8c094ed591b88a6ed97672c48c6dd4a0ebf14c8"} Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.433909 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: W0126 00:09:31.438308 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83681e9d_cef1_4622_a31a_78d0db757f03.slice/crio-4fd7eeb52d704e4fd36265b593d8062a83d41a781c5a1a920a58626d6ff5a7eb WatchSource:0}: Error finding container 4fd7eeb52d704e4fd36265b593d8062a83d41a781c5a1a920a58626d6ff5a7eb: Status 404 returned error can't find the container with id 4fd7eeb52d704e4fd36265b593d8062a83d41a781c5a1a920a58626d6ff5a7eb Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.438573 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" event={"ID":"8077625e-f489-46b7-9b01-66a13c82f649","Type":"ContainerStarted","Data":"367f35e415b5498a17912572c58b183deb2e485d344b37da8ddb67f0564aacfd"} Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.469954 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.473241 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.488219 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.493114 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.500708 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.500777 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.500817 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.500869 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.502645 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.508777 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.510527 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.524982 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.526043 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.535207 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.555796 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.572804 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d56td"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.574507 4874 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.593608 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m97d7"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.597144 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.646888 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.650031 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.650328 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.652653 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.662003 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.677472 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.678276 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.692649 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.697155 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.714711 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 26 00:09:31 crc kubenswrapper[4874]: W0126 00:09:31.730517 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7adc2094_bd76_49f1_a373_5cbe2212ec76.slice/crio-c4bbe8b1a56e3c90e5b2e19ef42e400e538c7ee5f0c7a1c4ae6271ac5f099f39 WatchSource:0}: Error finding container c4bbe8b1a56e3c90e5b2e19ef42e400e538c7ee5f0c7a1c4ae6271ac5f099f39: Status 404 returned error can't find the container with id c4bbe8b1a56e3c90e5b2e19ef42e400e538c7ee5f0c7a1c4ae6271ac5f099f39 Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.738416 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.753893 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.773832 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.793185 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.816981 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.835226 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.854476 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.899769 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.919983 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mkq\" (UniqueName: \"kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq\") pod \"image-pruner-29489760-gbhtf\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 
00:09:31.932041 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4lt\" (UniqueName: \"kubernetes.io/projected/b8e09f0d-63bf-4b8d-9485-e884d87eb6a6-kube-api-access-xf4lt\") pod \"etcd-operator-b45778765-lzswr\" (UID: \"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.955467 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptwt\" (UniqueName: \"kubernetes.io/projected/cb7fefa5-3d99-467d-b674-a013b8d2374f-kube-api-access-fptwt\") pod \"cluster-image-registry-operator-dc59b4c8b-87jnw\" (UID: \"cb7fefa5-3d99-467d-b674-a013b8d2374f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.972410 4874 request.go:700] Waited for 1.89917541s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/serviceaccounts/router/token Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.978919 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.986772 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/d3e058cc-ca8f-4856-95d1-37b33389f471-kube-api-access-tdjkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmxkb\" (UID: \"d3e058cc-ca8f-4856-95d1-37b33389f471\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:31 crc kubenswrapper[4874]: I0126 00:09:31.991148 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85zc\" (UniqueName: \"kubernetes.io/projected/aa444927-3d15-42ba-b2aa-0ea838d8726a-kube-api-access-x85zc\") pod \"router-default-5444994796-tmxht\" (UID: \"aa444927-3d15-42ba-b2aa-0ea838d8726a\") " pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.000461 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.018167 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4kr\" (UniqueName: \"kubernetes.io/projected/92ad92d9-cb0d-4c2a-a19d-e627164e297a-kube-api-access-cx4kr\") pod \"downloads-7954f5f757-ks9sj\" (UID: \"92ad92d9-cb0d-4c2a-a19d-e627164e297a\") " pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.034242 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.035116 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrqp\" (UniqueName: \"kubernetes.io/projected/6a6a602f-63bf-44ed-bb17-f1965f4be17b-kube-api-access-cjrqp\") pod \"migrator-59844c95c7-zd6c2\" (UID: \"6a6a602f-63bf-44ed-bb17-f1965f4be17b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.063509 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.073490 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.093836 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.114147 4874 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.133018 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.154250 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.161535 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.174763 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.199811 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.200049 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.210241 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.212455 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.218416 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265821 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265860 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265897 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02f0e3a2-429a-4997-bd5d-a7d808d97282-metrics-tls\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265918 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-oauth-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265945 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265974 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-kube-api-access-jr4qw\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.265991 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8k592\" (UID: 
\"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266009 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266028 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266046 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-trusted-ca\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266063 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksq99\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266077 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266093 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b7528a0-dc0c-4496-9209-81aecb79f639-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266110 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-trusted-ca-bundle\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266126 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir\") pod 
\"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266161 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266182 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266205 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266221 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266239 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhdm\" (UniqueName: \"kubernetes.io/projected/0ddb705c-de1d-440f-826e-b539340b1eca-kube-api-access-mdhdm\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266258 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.266275 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268050 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-bound-sa-token\") 
pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268078 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268106 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtjj\" (UniqueName: \"kubernetes.io/projected/5b7528a0-dc0c-4496-9209-81aecb79f639-kube-api-access-drtjj\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268121 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268138 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268154 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268169 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9sj\" (UniqueName: \"kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268186 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-console-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268203 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268221 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268240 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-oauth-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268258 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6267p\" (UniqueName: \"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-kube-api-access-6267p\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268273 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268289 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2pq\" (UniqueName: \"kubernetes.io/projected/02f0e3a2-429a-4997-bd5d-a7d808d97282-kube-api-access-fj2pq\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268306 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghdk\" (UniqueName: \"kubernetes.io/projected/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-kube-api-access-9ghdk\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268324 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-service-ca\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268340 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ddb705c-de1d-440f-826e-b539340b1eca-serving-cert\") pod \"console-operator-58897d9998-xqz88\" (UID: 
\"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268358 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268377 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-serving-cert\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268407 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268427 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268445 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301f288a-4f83-48c0-aeef-496a3f814ee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268465 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/301f288a-4f83-48c0-aeef-496a3f814ee2-metrics-tls\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268482 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpmk\" (UniqueName: \"kubernetes.io/projected/b7ea5432-a092-4b86-99c0-04ec6257f221-kube-api-access-7tpmk\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.268507 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-config\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " 
pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.268742 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:32.768726035 +0000 UTC m=+147.502832572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.271948 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.287498 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.323469 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.383921 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384201 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b287aa8-5b0e-449a-bd5e-8561d3c74287-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384242 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7f8s\" (UniqueName: \"kubernetes.io/projected/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-kube-api-access-h7f8s\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384274 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-node-bootstrap-token\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384306 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384336 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384367 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8rk4\" (UniqueName: \"kubernetes.io/projected/42350fca-6366-4ec7-8b0c-2160cb713c80-kube-api-access-s8rk4\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384394 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c604832e-4a13-4848-9deb-259c130a4881-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384419 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384449 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384496 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384522 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384547 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/301f288a-4f83-48c0-aeef-496a3f814ee2-metrics-tls\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384594 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384637 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02f0e3a2-429a-4997-bd5d-a7d808d97282-metrics-tls\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384671 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-kube-api-access-jr4qw\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384703 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2630ca0-dedc-4fc8-96b7-deddea64e443-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384743 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h9px\" (UniqueName: \"kubernetes.io/projected/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-kube-api-access-8h9px\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384790 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384829 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-trusted-ca\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384856 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksq99\" (UniqueName: 
\"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384886 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gg8z\" (UniqueName: \"kubernetes.io/projected/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-kube-api-access-7gg8z\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.384917 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-trusted-ca-bundle\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385252 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385289 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppc9\" (UniqueName: \"kubernetes.io/projected/e9650625-1b67-4b67-8a69-d8738bf6d762-kube-api-access-mppc9\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385332 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-csi-data-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385366 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854a7fc0-a802-478f-98c3-3e1c494c40d9-config\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385434 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385466 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-mountpoint-dir\") pod 
\"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385492 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/42350fca-6366-4ec7-8b0c-2160cb713c80-tmpfs\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385523 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385564 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhdm\" (UniqueName: \"kubernetes.io/projected/0ddb705c-de1d-440f-826e-b539340b1eca-kube-api-access-mdhdm\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.385593 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:32.885570428 +0000 UTC m=+147.619676965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385623 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385672 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385701 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385732 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlw5\" (UniqueName: \"kubernetes.io/projected/e02ae627-2219-4dbd-b96e-219b902d594a-kube-api-access-8rlw5\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385762 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-registration-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385813 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385839 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ae627-2219-4dbd-b96e-219b902d594a-serving-cert\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385862 
4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385909 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtjj\" (UniqueName: \"kubernetes.io/projected/5b7528a0-dc0c-4496-9209-81aecb79f639-kube-api-access-drtjj\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.385940 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-proxy-tls\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.386023 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6vr\" (UniqueName: \"kubernetes.io/projected/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-kube-api-access-jz6vr\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.386050 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-cert\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387166 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-console-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387204 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387284 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387339 4874 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fj2pq\" (UniqueName: \"kubernetes.io/projected/02f0e3a2-429a-4997-bd5d-a7d808d97282-kube-api-access-fj2pq\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387381 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghdk\" (UniqueName: \"kubernetes.io/projected/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-kube-api-access-9ghdk\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387437 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2630ca0-dedc-4fc8-96b7-deddea64e443-config\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387459 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387509 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtcwg\" (UniqueName: \"kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387575 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-serving-cert\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.387620 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.389825 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301f288a-4f83-48c0-aeef-496a3f814ee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.389886 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xp45w\" (UniqueName: \"kubernetes.io/projected/5b287aa8-5b0e-449a-bd5e-8561d3c74287-kube-api-access-xp45w\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390076 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqtk\" (UniqueName: \"kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390152 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpmk\" (UniqueName: \"kubernetes.io/projected/b7ea5432-a092-4b86-99c0-04ec6257f221-kube-api-access-7tpmk\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390194 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-config\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390230 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390265 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbtb\" (UniqueName: \"kubernetes.io/projected/df3f7986-e9c7-4647-9a1e-d509480a4824-kube-api-access-7vbtb\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390295 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-socket-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390360 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xck6k\" (UniqueName: \"kubernetes.io/projected/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-kube-api-access-xck6k\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390401 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dks27\" (UniqueName: 
\"kubernetes.io/projected/c604832e-4a13-4848-9deb-259c130a4881-kube-api-access-dks27\") pod \"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390481 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-oauth-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390526 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.390667 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.396991 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02f0e3a2-429a-4997-bd5d-a7d808d97282-metrics-tls\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.400234 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.400780 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.400872 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.400923 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzs2\" (UniqueName: 
\"kubernetes.io/projected/165b8f95-910a-46b7-9e16-c3b813e54831-kube-api-access-lwzs2\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401090 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-srv-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401155 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-cabundle\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401191 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401223 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b7528a0-dc0c-4496-9209-81aecb79f639-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401483 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t82m\" (UniqueName: \"kubernetes.io/projected/9dc8424a-928a-4d4d-a1e7-02429361ecfb-kube-api-access-9t82m\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401634 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401709 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401796 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca\") 
pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401831 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401928 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2630ca0-dedc-4fc8-96b7-deddea64e443-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.401988 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/854a7fc0-a802-478f-98c3-3e1c494c40d9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.402159 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.402195 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.402243 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-srv-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.402310 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.402395 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.403009 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.404170 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-console-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.404548 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-trusted-ca\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.404855 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.405626 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-config-volume\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.405698 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-metrics-tls\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.405762 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-webhook-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.405797 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/165b8f95-910a-46b7-9e16-c3b813e54831-proxy-tls\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 
26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.405828 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.406993 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.407627 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.407821 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854a7fc0-a802-478f-98c3-3e1c494c40d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.407911 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.407993 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408032 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02ae627-2219-4dbd-b96e-219b902d594a-config\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408071 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408101 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9sj\" (UniqueName: \"kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408139 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz4c9\" (UniqueName: \"kubernetes.io/projected/d43627d2-c1f4-4335-ad78-0f49d3be3aed-kube-api-access-zz4c9\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408203 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-certs\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408262 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-oauth-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408301 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408336 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-plugins-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408404 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408437 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-profile-collector-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408466 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6267p\" (UniqueName: 
\"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-kube-api-access-6267p\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408505 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-key\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408540 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-service-ca\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408576 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ddb705c-de1d-440f-826e-b539340b1eca-serving-cert\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408607 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-images\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.408622 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.409744 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:32.90972408 +0000 UTC m=+147.643830617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.411378 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.413265 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-trusted-ca-bundle\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.414823 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-serving-cert\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.421608 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-service-ca\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.421848 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.422044 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.422123 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b7528a0-dc0c-4496-9209-81aecb79f639-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.423366 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.424387 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29489760-gbhtf"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.424420 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ddb705c-de1d-440f-826e-b539340b1eca-config\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.429488 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.430220 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-oauth-config\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.431595 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ea5432-a092-4b86-99c0-04ec6257f221-console-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.432290 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.438744 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.442732 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.462127 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/301f288a-4f83-48c0-aeef-496a3f814ee2-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.480886 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7ea5432-a092-4b86-99c0-04ec6257f221-oauth-serving-cert\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.482324 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301f288a-4f83-48c0-aeef-496a3f814ee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.482612 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.483203 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.485807 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhdm\" (UniqueName: \"kubernetes.io/projected/0ddb705c-de1d-440f-826e-b539340b1eca-kube-api-access-mdhdm\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.484228 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.487129 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ddb705c-de1d-440f-826e-b539340b1eca-serving-cert\") pod \"console-operator-58897d9998-xqz88\" (UID: \"0ddb705c-de1d-440f-826e-b539340b1eca\") " pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.489328 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.507234 4874 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksq99\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.509428 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.509487 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.009469136 +0000 UTC m=+147.743575673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511171 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzs2\" (UniqueName: \"kubernetes.io/projected/165b8f95-910a-46b7-9e16-c3b813e54831-kube-api-access-lwzs2\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511204 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-srv-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511227 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-cabundle\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511245 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t82m\" (UniqueName: \"kubernetes.io/projected/9dc8424a-928a-4d4d-a1e7-02429361ecfb-kube-api-access-9t82m\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511337 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511373 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511391 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2630ca0-dedc-4fc8-96b7-deddea64e443-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511414 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/854a7fc0-a802-478f-98c3-3e1c494c40d9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511433 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511450 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-srv-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511473 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-config-volume\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511490 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-metrics-tls\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511506 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-webhook-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc 
kubenswrapper[4874]: I0126 00:09:32.511522 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/165b8f95-910a-46b7-9e16-c3b813e54831-proxy-tls\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511543 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511562 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854a7fc0-a802-478f-98c3-3e1c494c40d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511588 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02ae627-2219-4dbd-b96e-219b902d594a-config\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511611 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz4c9\" (UniqueName: \"kubernetes.io/projected/d43627d2-c1f4-4335-ad78-0f49d3be3aed-kube-api-access-zz4c9\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511630 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-certs\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511647 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-plugins-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511665 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.511682 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-profile-collector-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516475 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-key\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516515 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-images\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516532 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b287aa8-5b0e-449a-bd5e-8561d3c74287-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516551 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7f8s\" (UniqueName: \"kubernetes.io/projected/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-kube-api-access-h7f8s\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516569 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-node-bootstrap-token\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516592 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516609 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8rk4\" (UniqueName: \"kubernetes.io/projected/42350fca-6366-4ec7-8b0c-2160cb713c80-kube-api-access-s8rk4\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516624 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c604832e-4a13-4848-9deb-259c130a4881-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516641 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516658 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516674 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516707 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2630ca0-dedc-4fc8-96b7-deddea64e443-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516726 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h9px\" (UniqueName: \"kubernetes.io/projected/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-kube-api-access-8h9px\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516748 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gg8z\" (UniqueName: \"kubernetes.io/projected/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-kube-api-access-7gg8z\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516776 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516793 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppc9\" (UniqueName: \"kubernetes.io/projected/e9650625-1b67-4b67-8a69-d8738bf6d762-kube-api-access-mppc9\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516812 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-csi-data-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516827 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854a7fc0-a802-478f-98c3-3e1c494c40d9-config\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516845 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-mountpoint-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516859 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/42350fca-6366-4ec7-8b0c-2160cb713c80-tmpfs\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516892 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlw5\" (UniqueName: \"kubernetes.io/projected/e02ae627-2219-4dbd-b96e-219b902d594a-kube-api-access-8rlw5\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516908 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-registration-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516935 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516950 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ae627-2219-4dbd-b96e-219b902d594a-serving-cert\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.516979 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.517006 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-proxy-tls\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.517692 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2630ca0-dedc-4fc8-96b7-deddea64e443-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.518503 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-cabundle\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.518763 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6vr\" (UniqueName: \"kubernetes.io/projected/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-kube-api-access-jz6vr\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.518837 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-cert\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.518930 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2630ca0-dedc-4fc8-96b7-deddea64e443-config\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.519011 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.519087 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtcwg\" (UniqueName: \"kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg\") pod 
\"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.519121 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.019092154 +0000 UTC m=+147.753198691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.519267 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp45w\" (UniqueName: \"kubernetes.io/projected/5b287aa8-5b0e-449a-bd5e-8561d3c74287-kube-api-access-xp45w\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.520076 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqtk\" (UniqueName: \"kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.520205 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vbtb\" (UniqueName: \"kubernetes.io/projected/df3f7986-e9c7-4647-9a1e-d509480a4824-kube-api-access-7vbtb\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.520274 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-socket-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.520341 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xck6k\" (UniqueName: \"kubernetes.io/projected/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-kube-api-access-xck6k\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.520413 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dks27\" (UniqueName: \"kubernetes.io/projected/c604832e-4a13-4848-9deb-259c130a4881-kube-api-access-dks27\") pod \"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.521436 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.521695 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-csi-data-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.521904 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02ae627-2219-4dbd-b96e-219b902d594a-config\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.523843 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" event={"ID":"cb7fefa5-3d99-467d-b674-a013b8d2374f","Type":"ContainerStarted","Data":"78fb2e265c0b21fe40737793457074e819aab088543456e49d1b1385442833a3"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.524401 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.525134 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854a7fc0-a802-478f-98c3-3e1c494c40d9-config\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.525406 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-mountpoint-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.525747 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/42350fca-6366-4ec7-8b0c-2160cb713c80-tmpfs\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.527629 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-registration-dir\") pod 
\"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.528846 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-config-volume\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.529795 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/165b8f95-910a-46b7-9e16-c3b813e54831-images\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.530372 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-webhook-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.530808 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtjj\" (UniqueName: \"kubernetes.io/projected/5b7528a0-dc0c-4496-9209-81aecb79f639-kube-api-access-drtjj\") pod \"cluster-samples-operator-665b6dd947-56tlt\" (UID: \"5b7528a0-dc0c-4496-9209-81aecb79f639\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.531178 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.531430 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.535128 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.532257 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2pq\" (UniqueName: \"kubernetes.io/projected/02f0e3a2-429a-4997-bd5d-a7d808d97282-kube-api-access-fj2pq\") pod \"dns-operator-744455d44c-mcbjn\" (UID: \"02f0e3a2-429a-4997-bd5d-a7d808d97282\") " pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.532607 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jr4qw\" (UniqueName: \"kubernetes.io/projected/3f7e087e-3f78-4e39-81e9-ad6c357bb27b-kube-api-access-jr4qw\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tfbt\" (UID: \"3f7e087e-3f78-4e39-81e9-ad6c357bb27b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.532610 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e9650625-1b67-4b67-8a69-d8738bf6d762-srv-cert\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.532892 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-metrics-tls\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.533134 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.531700 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/165b8f95-910a-46b7-9e16-c3b813e54831-proxy-tls\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.533770 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-socket-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.533903 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-node-bootstrap-token\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.534209 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.535173 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"99da47edc896d8f6f674562e876d7bdb2e92f07e717ab26fd9c705767de67fed"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.539046 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b9e68b49327f1b20327fd3d78015ec6871938c2e66e97351a5323c641ee41f06"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.538836 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9dc8424a-928a-4d4d-a1e7-02429361ecfb-plugins-dir\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.538899 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42350fca-6366-4ec7-8b0c-2160cb713c80-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.538793 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-profile-collector-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.539183 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2630ca0-dedc-4fc8-96b7-deddea64e443-config\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.539303 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29489760-gbhtf" event={"ID":"1413e57b-4a4d-4ae9-902f-a1e5fc080588","Type":"ContainerStarted","Data":"42fa368b26bb13eecf29705bd3953d3261055ec55a082a0258fab4fd3a2f7088"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.533692 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df3f7986-e9c7-4647-9a1e-d509480a4824-srv-cert\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.540572 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.540888 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-proxy-tls\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.542306 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" event={"ID":"960d80ae-7893-4dde-a757-fb45b8725c8e","Type":"ContainerStarted","Data":"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.542352 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" event={"ID":"960d80ae-7893-4dde-a757-fb45b8725c8e","Type":"ContainerStarted","Data":"2ba63b5502bc0c1a7b45584aa5a0b4fb2f4e88c035581728f552f10712673ed5"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.543288 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.543363 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.543837 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d43627d2-c1f4-4335-ad78-0f49d3be3aed-signing-key\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.546637 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.547084 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" event={"ID":"4d5e251e-34cd-4620-90fc-0683fa3ed2ef","Type":"ContainerStarted","Data":"3497778566d45bca3ce4f6221ed34c90f3c4999686e41634b8434ba089991605"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.547120 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" event={"ID":"4d5e251e-34cd-4620-90fc-0683fa3ed2ef","Type":"ContainerStarted","Data":"6857a50262d456cb99a0314261ae7a77b391256e0876a22d11ae1409b2688edd"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.548186 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.549274 4874 generic.go:334] "Generic (PLEG): container finished" podID="7adc2094-bd76-49f1-a373-5cbe2212ec76" containerID="a88ba8c2b0742deef847f863e3ef0315976daf01d6e8bff126a08b86c8cb89dc" exitCode=0 Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.549356 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d56td" event={"ID":"7adc2094-bd76-49f1-a373-5cbe2212ec76","Type":"ContainerDied","Data":"a88ba8c2b0742deef847f863e3ef0315976daf01d6e8bff126a08b86c8cb89dc"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.549390 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d56td" event={"ID":"7adc2094-bd76-49f1-a373-5cbe2212ec76","Type":"ContainerStarted","Data":"c4bbe8b1a56e3c90e5b2e19ef42e400e538c7ee5f0c7a1c4ae6271ac5f099f39"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.555120 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghdk\" (UniqueName: \"kubernetes.io/projected/09eba8a2-56e9-4722-9d7f-c041b2ab6d36-kube-api-access-9ghdk\") pod \"openshift-config-operator-7777fb866f-7qhd5\" (UID: \"09eba8a2-56e9-4722-9d7f-c041b2ab6d36\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.555354 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.555679 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.557225 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ae627-2219-4dbd-b96e-219b902d594a-serving-cert\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.557929 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" event={"ID":"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f","Type":"ContainerStarted","Data":"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.560665 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" event={"ID":"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f","Type":"ContainerStarted","Data":"76936ddede7b94a617f20b2d0e6f2326548be5378ddaf97469011d5e527767af"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.558497 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-certs\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.559154 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854a7fc0-a802-478f-98c3-3e1c494c40d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.561044 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.558856 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c604832e-4a13-4848-9deb-259c130a4881-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.561090 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b287aa8-5b0e-449a-bd5e-8561d3c74287-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.562601 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.563921 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-cert\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.564304 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.580915 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpmk\" (UniqueName: \"kubernetes.io/projected/b7ea5432-a092-4b86-99c0-04ec6257f221-kube-api-access-7tpmk\") pod \"console-f9d7485db-s6njc\" (UID: \"b7ea5432-a092-4b86-99c0-04ec6257f221\") " pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.581603 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a13baf5968893c91d837b20899567dda20a6a32009f82c18ca1856976d01aed7"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.585586 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2d78fd6624f7a1d2aaaba1a2fa3f7801641d8691a1d5d73ae93745b105c27c08"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.586167 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.591050 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.592106 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" event={"ID":"83681e9d-cef1-4622-a31a-78d0db757f03","Type":"ContainerStarted","Data":"af8e301f87a7540b82e04e9a3100e57c13b6b26acc159cb42487df9d9d69033f"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.592349 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" event={"ID":"83681e9d-cef1-4622-a31a-78d0db757f03","Type":"ContainerStarted","Data":"a26d0a22e55bc5ee6741bd8b05be8383af71c91dbe2ef2b249f47b7ae2069037"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.592510 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" event={"ID":"83681e9d-cef1-4622-a31a-78d0db757f03","Type":"ContainerStarted","Data":"4fd7eeb52d704e4fd36265b593d8062a83d41a781c5a1a920a58626d6ff5a7eb"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.599143 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" 
event={"ID":"8077625e-f489-46b7-9b01-66a13c82f649","Type":"ContainerStarted","Data":"51be4202e77cea981a72c570d7d2064661b5feeb343919234847187b7ca4c8ce"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.599187 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" event={"ID":"8077625e-f489-46b7-9b01-66a13c82f649","Type":"ContainerStarted","Data":"92c5b9540b321c9c053b078be40856c2106553d8237afd757437386ffe956a1c"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.608674 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ks9sj"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.609628 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" event={"ID":"8863b23d-8bed-4a8c-af20-ce709512fc8b","Type":"ContainerStarted","Data":"6d9a2397164d8b54a7f9503cc09faa92d06d805fb49591bf1d64e96e46378954"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.612679 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9sj\" (UniqueName: \"kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj\") pod \"oauth-openshift-558db77b4-8k592\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.626786 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.627923 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.127900172 +0000 UTC m=+147.862006709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.631082 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6267p\" (UniqueName: \"kubernetes.io/projected/301f288a-4f83-48c0-aeef-496a3f814ee2-kube-api-access-6267p\") pod \"ingress-operator-5b745b69d9-fr5kd\" (UID: \"301f288a-4f83-48c0-aeef-496a3f814ee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.634628 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" event={"ID":"d3e058cc-ca8f-4856-95d1-37b33389f471","Type":"ContainerStarted","Data":"53309679ab7cdcf6478cbd3203311c6a820e48edff71fcd9261a18c2ee49d762"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.645314 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.669089 4874 generic.go:334] "Generic (PLEG): container finished" podID="ff24fe21-0695-45eb-a15b-150aa14b1a58" containerID="30e1835f0f4b182a6be9175ba83b86a25aad8298a66577620b52b25be2063097" exitCode=0 Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.669210 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" event={"ID":"ff24fe21-0695-45eb-a15b-150aa14b1a58","Type":"ContainerDied","Data":"30e1835f0f4b182a6be9175ba83b86a25aad8298a66577620b52b25be2063097"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.669248 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" event={"ID":"ff24fe21-0695-45eb-a15b-150aa14b1a58","Type":"ContainerStarted","Data":"9ea35a473f0a4f46e66039774725400aa914c03493913d4f3a7b1f45e7c270bd"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.681602 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzs2\" (UniqueName: \"kubernetes.io/projected/165b8f95-910a-46b7-9e16-c3b813e54831-kube-api-access-lwzs2\") pod \"machine-config-operator-74547568cd-gjzmd\" (UID: \"165b8f95-910a-46b7-9e16-c3b813e54831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.687991 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tmxht" event={"ID":"aa444927-3d15-42ba-b2aa-0ea838d8726a","Type":"ContainerStarted","Data":"71da449dc4066323468d6b492483a65fa6571ada67aa3e95969baa41b219f0f6"} Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.694065 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t82m\" (UniqueName: \"kubernetes.io/projected/9dc8424a-928a-4d4d-a1e7-02429361ecfb-kube-api-access-9t82m\") pod \"csi-hostpathplugin-k7t65\" (UID: \"9dc8424a-928a-4d4d-a1e7-02429361ecfb\") " pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 
26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.730184 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.732752 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.23273562 +0000 UTC m=+147.966842157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.734203 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.749144 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/854a7fc0-a802-478f-98c3-3e1c494c40d9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zs4sn\" (UID: \"854a7fc0-a802-478f-98c3-3e1c494c40d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.752889 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.753362 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dks27\" (UniqueName: \"kubernetes.io/projected/c604832e-4a13-4848-9deb-259c130a4881-kube-api-access-dks27\") pod \"multus-admission-controller-857f4d67dd-p65x9\" (UID: \"c604832e-4a13-4848-9deb-259c130a4881\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.755846 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2630ca0-dedc-4fc8-96b7-deddea64e443-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lfgxs\" (UID: \"a2630ca0-dedc-4fc8-96b7-deddea64e443\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.780505 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.782719 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h9px\" (UniqueName: \"kubernetes.io/projected/e8fd9d9e-4607-4d4d-95f9-c18c37440d65-kube-api-access-8h9px\") pod \"ingress-canary-4rsrb\" (UID: \"e8fd9d9e-4607-4d4d-95f9-c18c37440d65\") " pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.797669 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.802277 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gg8z\" (UniqueName: \"kubernetes.io/projected/12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb-kube-api-access-7gg8z\") pod \"dns-default-wp989\" (UID: \"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb\") " pod="openshift-dns/dns-default-wp989" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.816089 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppc9\" (UniqueName: \"kubernetes.io/projected/e9650625-1b67-4b67-8a69-d8738bf6d762-kube-api-access-mppc9\") pod \"olm-operator-6b444d44fb-lpbc4\" (UID: \"e9650625-1b67-4b67-8a69-d8738bf6d762\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.825950 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.834150 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.835280 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.335252954 +0000 UTC m=+148.069359491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.858710 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rlw5\" (UniqueName: \"kubernetes.io/projected/e02ae627-2219-4dbd-b96e-219b902d594a-kube-api-access-8rlw5\") pod \"service-ca-operator-777779d784-nrq2s\" (UID: \"e02ae627-2219-4dbd-b96e-219b902d594a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.864493 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqtk\" (UniqueName: \"kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk\") pod \"marketplace-operator-79b997595-t8ffn\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.894344 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7f8s\" (UniqueName: \"kubernetes.io/projected/be0c7c98-d60e-40cb-bec7-0685c2ef11a2-kube-api-access-h7f8s\") pod \"package-server-manager-789f6589d5-xvzj9\" (UID: \"be0c7c98-d60e-40cb-bec7-0685c2ef11a2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.897197 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lzswr"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.897359 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.932633 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.936530 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:32 crc kubenswrapper[4874]: E0126 00:09:32.937012 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.436998796 +0000 UTC m=+148.171105333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.939762 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.952433 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.952924 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.958229 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2"] Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.963284 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6vr\" (UniqueName: \"kubernetes.io/projected/e10d7cc7-6177-4a1f-882c-91d6bf3a738f-kube-api-access-jz6vr\") pod \"machine-config-controller-84d6567774-t75tb\" (UID: \"e10d7cc7-6177-4a1f-882c-91d6bf3a738f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.963409 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vbtb\" (UniqueName: \"kubernetes.io/projected/df3f7986-e9c7-4647-9a1e-d509480a4824-kube-api-access-7vbtb\") pod \"catalog-operator-68c6474976-7nm8d\" (UID: \"df3f7986-e9c7-4647-9a1e-d509480a4824\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.965415 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.970723 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtcwg\" (UniqueName: \"kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg\") pod \"collect-profiles-29489760-c46dr\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.975541 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8rk4\" (UniqueName: \"kubernetes.io/projected/42350fca-6366-4ec7-8b0c-2160cb713c80-kube-api-access-s8rk4\") pod \"packageserver-d55dfcdfc-sp6fv\" (UID: \"42350fca-6366-4ec7-8b0c-2160cb713c80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.978769 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.984598 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.989876 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp45w\" (UniqueName: \"kubernetes.io/projected/5b287aa8-5b0e-449a-bd5e-8561d3c74287-kube-api-access-xp45w\") pod \"control-plane-machine-set-operator-78cbb6b69f-2cnr6\" (UID: \"5b287aa8-5b0e-449a-bd5e-8561d3c74287\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:32 crc kubenswrapper[4874]: I0126 00:09:32.991858 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.004007 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xck6k\" (UniqueName: \"kubernetes.io/projected/277abf5e-0d7e-469b-916e-d6d8bb7b75d3-kube-api-access-xck6k\") pod \"machine-config-server-x7ct8\" (UID: \"277abf5e-0d7e-469b-916e-d6d8bb7b75d3\") " pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.004327 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.010914 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rsrb" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.011786 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fbf881-a4a3-4e2a-9c33-b332dfcf7d65-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwrvc\" (UID: \"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.034540 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz4c9\" (UniqueName: \"kubernetes.io/projected/d43627d2-c1f4-4335-ad78-0f49d3be3aed-kube-api-access-zz4c9\") pod \"service-ca-9c57cc56f-ndg7w\" (UID: \"d43627d2-c1f4-4335-ad78-0f49d3be3aed\") " pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.044233 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wp989" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.049377 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5"] Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.050184 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.050604 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.550576197 +0000 UTC m=+148.284682734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.155483 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.155984 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.655970391 +0000 UTC m=+148.390076928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.210647 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.217573 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.218284 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.227119 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.237334 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.244190 4874 csr.go:261] certificate signing request csr-2crln is approved, waiting to be issued Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.246984 4874 csr.go:257] certificate signing request csr-2crln is issued Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.260514 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.261131 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.261259 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.761235881 +0000 UTC m=+148.495342418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.273302 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.302946 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x7ct8" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.321718 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7t65"] Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.364791 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.365217 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.865203244 +0000 UTC m=+148.599309781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.456369 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" podStartSLOduration=124.456349511 podStartE2EDuration="2m4.456349511s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:33.430508942 +0000 UTC m=+148.164615479" watchObservedRunningTime="2026-01-26 00:09:33.456349511 +0000 UTC m=+148.190456048" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.468536 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.468779 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:33.968760847 +0000 UTC m=+148.702867384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.571440 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.571856 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.071844276 +0000 UTC m=+148.805950803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.663724 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pcfbh" podStartSLOduration=125.663708103 podStartE2EDuration="2m5.663708103s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:33.66253841 +0000 UTC m=+148.396644947" watchObservedRunningTime="2026-01-26 00:09:33.663708103 +0000 UTC m=+148.397814640" Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.674035 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.674125 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.174106902 +0000 UTC m=+148.908213439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.674361 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.674626 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.174616836 +0000 UTC m=+148.908723373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.740291 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" event={"ID":"cb7fefa5-3d99-467d-b674-a013b8d2374f","Type":"ContainerStarted","Data":"fb1dad999223541ddde459c53c28bf57dde84c7a98227a3cc4cd0742f43f676b"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.757260 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f48c47fe16896d952610fec375d349eb66b70f1dbcb81cae4bbb20f2941e079b"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.774938 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.775415 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.275395981 +0000 UTC m=+149.009502518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.796951 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d56td" event={"ID":"7adc2094-bd76-49f1-a373-5cbe2212ec76","Type":"ContainerStarted","Data":"83e40242122d1409534d74344b8213f04956b2477bcbe4dadb9ce2c4dac200e3"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.807327 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tmxht" event={"ID":"aa444927-3d15-42ba-b2aa-0ea838d8726a","Type":"ContainerStarted","Data":"12594dbfc545585589ced1db3aebfd35259127eb4d0d41ed784b6b21487cfc40"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.810325 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" event={"ID":"d3e058cc-ca8f-4856-95d1-37b33389f471","Type":"ContainerStarted","Data":"98ad213c1b20c0e88c0ed006d0cb764f5146b02263f5ba92d26486fbded7f36a"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.814247 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7062c909057b3fc61b0e2dee8207edba218848c64669551c03308c492f0d5f52"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.815820 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" event={"ID":"6a6a602f-63bf-44ed-bb17-f1965f4be17b","Type":"ContainerStarted","Data":"8dd547b9db604b27d4f0b3b0120c4d930c87ee2f47cf19cf63612b609e94f942"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.876942 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.880024 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.380009673 +0000 UTC m=+149.114116210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.887802 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" event={"ID":"09eba8a2-56e9-4722-9d7f-c041b2ab6d36","Type":"ContainerStarted","Data":"b49fc4d3fcd82f00979a170db31444098dfebf097b75839d35d5d4ddd391c7fc"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.903246 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" event={"ID":"5b7528a0-dc0c-4496-9209-81aecb79f639","Type":"ContainerStarted","Data":"d5132af7b8be109990f1b536002ce89df60b330150866e5e8e71b9084b4e6f6c"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.933675 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" event={"ID":"9dc8424a-928a-4d4d-a1e7-02429361ecfb","Type":"ContainerStarted","Data":"c878dd726a523a483c105b1b07c6e9a4690862531246b186a259feba41176237"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.943332 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" event={"ID":"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6","Type":"ContainerStarted","Data":"c2719aba2e65476b629722b868ea508cbfe57a586b17fe7ab319a2d89606063f"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.977028 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" event={"ID":"ff24fe21-0695-45eb-a15b-150aa14b1a58","Type":"ContainerStarted","Data":"908c7f37722e5e768723e02a7662baad907a37fe520714663b6ac5a5379dd7a5"} Jan 26 00:09:33 crc kubenswrapper[4874]: I0126 00:09:33.977636 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:33 crc kubenswrapper[4874]: E0126 00:09:33.977927 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.477907708 +0000 UTC m=+149.212014245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.000293 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29489760-gbhtf" event={"ID":"1413e57b-4a4d-4ae9-902f-a1e5fc080588","Type":"ContainerStarted","Data":"a08638902ea697ab404040f515d51807d63becbd73aece700d6ca8415fe34037"} Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.013880 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ks9sj" event={"ID":"92ad92d9-cb0d-4c2a-a19d-e627164e297a","Type":"ContainerStarted","Data":"8c8980edd65420aa4043859e3d6f9cf5cd96e894ff24793fa3b2d1c0702edff4"} Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.013928 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ks9sj" event={"ID":"92ad92d9-cb0d-4c2a-a19d-e627164e297a","Type":"ContainerStarted","Data":"80a2838a49683cf7409eb074f24308b78199a5600ca5ace9bf672b8bd171dc47"} Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.086862 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.088714 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.588694972 +0000 UTC m=+149.322801589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.140163 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" podStartSLOduration=126.140144584 podStartE2EDuration="2m6.140144584s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:34.138741575 +0000 UTC m=+148.872848112" watchObservedRunningTime="2026-01-26 00:09:34.140144584 +0000 UTC m=+148.874251121" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.179336 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mcbjn"] Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.187868 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.188258 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.688233092 +0000 UTC m=+149.422339629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.199488 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s6njc"] Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.252169 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-26 00:04:33 +0000 UTC, rotation deadline is 2026-12-03 20:44:32.729784532 +0000 UTC Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.252485 4874 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7484h34m58.477302702s for next certificate rotation Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.290305 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.290688 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.790673273 +0000 UTC m=+149.524779810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.291068 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.304069 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" podStartSLOduration=125.304042846 podStartE2EDuration="2m5.304042846s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:34.301547086 +0000 UTC m=+149.035653623" watchObservedRunningTime="2026-01-26 00:09:34.304042846 +0000 UTC m=+149.038149383" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.328611 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7wd7t" podStartSLOduration=125.328589559 podStartE2EDuration="2m5.328589559s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:34.328503536 +0000 UTC m=+149.062610103" watchObservedRunningTime="2026-01-26 00:09:34.328589559 +0000 UTC m=+149.062696096" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.391334 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.391620 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.891600783 +0000 UTC m=+149.625707320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.496616 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.496950 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:34.996938504 +0000 UTC m=+149.731045041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: W0126 00:09:34.553602 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f0e3a2_429a_4997_bd5d_a7d808d97282.slice/crio-197e7a56c6dd3bdfea5b9e360d9e8f77701e1fd97b77f81a4fd5d83cfd024842 WatchSource:0}: Error finding container 197e7a56c6dd3bdfea5b9e360d9e8f77701e1fd97b77f81a4fd5d83cfd024842: Status 404 returned error can't find the container with id 197e7a56c6dd3bdfea5b9e360d9e8f77701e1fd97b77f81a4fd5d83cfd024842 Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.597499 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.598469 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.09843898 +0000 UTC m=+149.832545517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.701858 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.702317 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.20230058 +0000 UTC m=+149.936407117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.803883 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.804264 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.304246068 +0000 UTC m=+150.038352605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.809314 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zdmd8" podStartSLOduration=126.809293298 podStartE2EDuration="2m6.809293298s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:34.809016021 +0000 UTC m=+149.543122568" watchObservedRunningTime="2026-01-26 00:09:34.809293298 +0000 UTC m=+149.543399835" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.827571 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:34 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:34 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:34 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.827625 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:34 crc kubenswrapper[4874]: I0126 00:09:34.905486 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:34 crc kubenswrapper[4874]: E0126 00:09:34.906101 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.406086152 +0000 UTC m=+150.140192689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.011145 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.012674 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.512645188 +0000 UTC m=+150.246751725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.013246 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.013613 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.513602185 +0000 UTC m=+150.247708712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.020813 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" podStartSLOduration=126.020794155 podStartE2EDuration="2m6.020794155s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:34.981063649 +0000 UTC m=+149.715170186" watchObservedRunningTime="2026-01-26 00:09:35.020794155 +0000 UTC m=+149.754900692" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.054578 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d56td" event={"ID":"7adc2094-bd76-49f1-a373-5cbe2212ec76","Type":"ContainerStarted","Data":"4ac073cb145a17b3fe373dcd342964df2a8b5448c8053c47aec3b50414d5d235"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.069169 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s6njc" event={"ID":"b7ea5432-a092-4b86-99c0-04ec6257f221","Type":"ContainerStarted","Data":"274af2c04c3fc00dcd15423648fbf6cb28e45604772ce8bfcd3dc614ec1d22e5"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.070558 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x7ct8" event={"ID":"277abf5e-0d7e-469b-916e-d6d8bb7b75d3","Type":"ContainerStarted","Data":"ecbb71d8164b65c86b383693d0b37d938c7f66504998003d263c21ffe27e8110"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.070584 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x7ct8" event={"ID":"277abf5e-0d7e-469b-916e-d6d8bb7b75d3","Type":"ContainerStarted","Data":"7872657bfd707279231d29d9b0f921e654acb305ed057762851f6ef31edfa673"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.080145 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ks9sj" podStartSLOduration=126.080128987 podStartE2EDuration="2m6.080128987s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.077888204 +0000 UTC m=+149.811994741" watchObservedRunningTime="2026-01-26 00:09:35.080128987 +0000 UTC m=+149.814235524" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.081306 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-87jnw" podStartSLOduration=126.081299459 podStartE2EDuration="2m6.081299459s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.024120218 +0000 UTC m=+149.758226755" watchObservedRunningTime="2026-01-26 00:09:35.081299459 +0000 UTC m=+149.815405996" Jan 26 
00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.097647 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" event={"ID":"9dc8424a-928a-4d4d-a1e7-02429361ecfb","Type":"ContainerStarted","Data":"9b023d0e9f00d719be19a884aaf10690fdb75c096d3108e5ed1bcb1a824fe007"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.113870 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.114281 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.115434 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.615417439 +0000 UTC m=+150.349523976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.118007 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" event={"ID":"5b7528a0-dc0c-4496-9209-81aecb79f639","Type":"ContainerStarted","Data":"bb7bda64bcd5d2e14f5eaab0477d104c2e504dc1d925fa34c4a25b2da24625e3"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.118041 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" event={"ID":"5b7528a0-dc0c-4496-9209-81aecb79f639","Type":"ContainerStarted","Data":"0c41878024b740f7a3fa113a45edba0d5cd0414d40cf66fb837a64fca0207b39"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.132085 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" event={"ID":"02f0e3a2-429a-4997-bd5d-a7d808d97282","Type":"ContainerStarted","Data":"197e7a56c6dd3bdfea5b9e360d9e8f77701e1fd97b77f81a4fd5d83cfd024842"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.135258 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29489760-gbhtf" podStartSLOduration=126.1352339 podStartE2EDuration="2m6.1352339s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.126384444 +0000 UTC m=+149.860490991" watchObservedRunningTime="2026-01-26 00:09:35.1352339 +0000 UTC m=+149.869340437" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.141064 4874 generic.go:334] "Generic (PLEG): container finished" podID="09eba8a2-56e9-4722-9d7f-c041b2ab6d36" 
containerID="c49f0eb9329d2bb7f2343630032af598f59f77544cd7c20210a1eaf9b6dd0787" exitCode=0 Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.141134 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" event={"ID":"09eba8a2-56e9-4722-9d7f-c041b2ab6d36","Type":"ContainerDied","Data":"c49f0eb9329d2bb7f2343630032af598f59f77544cd7c20210a1eaf9b6dd0787"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.191397 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" event={"ID":"b8e09f0d-63bf-4b8d-9485-e884d87eb6a6","Type":"ContainerStarted","Data":"a8361bdf150baf42444ffaac70f79c0b777b92ea06cd2e0a32bc9cf1960e6d50"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.215572 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmxkb" podStartSLOduration=126.215543006 podStartE2EDuration="2m6.215543006s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.20347772 +0000 UTC m=+149.937584257" watchObservedRunningTime="2026-01-26 00:09:35.215543006 +0000 UTC m=+149.949649553" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.224091 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.226891 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.726874741 +0000 UTC m=+150.460981278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.228303 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.230567 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" event={"ID":"6a6a602f-63bf-44ed-bb17-f1965f4be17b","Type":"ContainerStarted","Data":"d2cd12b057cd808b5388262fd6d93d57250ffa04b48246e0f2d11e651f3e37e2"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.230604 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" event={"ID":"6a6a602f-63bf-44ed-bb17-f1965f4be17b","Type":"ContainerStarted","Data":"92ae3f179086f79bfbbe56c52abb7f483f3f2fd9245f941d4baf7f6597f6a72b"} Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.234120 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.268273 4874 patch_prober.go:28] interesting pod/downloads-7954f5f757-ks9sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.268343 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ks9sj" podUID="92ad92d9-cb0d-4c2a-a19d-e627164e297a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.303644 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tmxht" podStartSLOduration=126.303627467 podStartE2EDuration="2m6.303627467s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.26346666 +0000 UTC m=+149.997573197" watchObservedRunningTime="2026-01-26 00:09:35.303627467 +0000 UTC m=+150.037734004" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.305153 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:35 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:35 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:35 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.305200 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" 
podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.326275 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.327744 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.827721037 +0000 UTC m=+150.561827574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.346177 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-d56td" podStartSLOduration=127.34616076 podStartE2EDuration="2m7.34616076s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.344572356 +0000 UTC m=+150.078678893" watchObservedRunningTime="2026-01-26 00:09:35.34616076 +0000 UTC m=+150.080267297" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.347353 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zd6c2" podStartSLOduration=126.347347283 podStartE2EDuration="2m6.347347283s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.305444598 +0000 UTC m=+150.039551155" watchObservedRunningTime="2026-01-26 00:09:35.347347283 +0000 UTC m=+150.081453820" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.390411 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-56tlt" podStartSLOduration=126.390382721 podStartE2EDuration="2m6.390382721s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.381612337 +0000 UTC m=+150.115718874" watchObservedRunningTime="2026-01-26 00:09:35.390382721 +0000 UTC m=+150.124489258" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.401698 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4"] Jan 26 00:09:35 crc kubenswrapper[4874]: W0126 00:09:35.421353 4874 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9650625_1b67_4b67_8a69_d8738bf6d762.slice/crio-311af82da54cabf2d49feb83eb3e8328f34da53da740bec7d8c82a9a4724f1b1 WatchSource:0}: Error finding container 311af82da54cabf2d49feb83eb3e8328f34da53da740bec7d8c82a9a4724f1b1: Status 404 returned error can't find the container with id 311af82da54cabf2d49feb83eb3e8328f34da53da740bec7d8c82a9a4724f1b1 Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.422984 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.428107 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.428600 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:35.928585184 +0000 UTC m=+150.662691721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.440973 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lzswr" podStartSLOduration=126.440934738 podStartE2EDuration="2m6.440934738s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.429106259 +0000 UTC m=+150.163212796" watchObservedRunningTime="2026-01-26 00:09:35.440934738 +0000 UTC m=+150.175041275" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.451878 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8k592"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.470809 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x7ct8" podStartSLOduration=6.470790259 podStartE2EDuration="6.470790259s" podCreationTimestamp="2026-01-26 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:35.470550442 +0000 UTC m=+150.204656979" watchObservedRunningTime="2026-01-26 00:09:35.470790259 +0000 UTC m=+150.204896796" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.528752 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.528937 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.028916277 +0000 UTC m=+150.763022814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.529046 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.529661 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.029630987 +0000 UTC m=+150.763737524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.640538 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.640912 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.140888693 +0000 UTC m=+150.874995230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.742579 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.743881 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.2438688 +0000 UTC m=+150.977975337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.794040 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.806109 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.813176 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wp989"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.846381 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.846674 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.346654361 +0000 UTC m=+151.080760898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.857421 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p65x9"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.867485 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.874651 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.882061 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rsrb"] Jan 26 00:09:35 crc kubenswrapper[4874]: W0126 00:09:35.884193 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4132d14_d050_466d_8c03_71a3fd5f08a5.slice/crio-d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625 WatchSource:0}: Error finding container d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625: Status 404 returned error can't find the container with id d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625 Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.903145 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.907427 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xqz88"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.915362 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ndg7w"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.920863 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.922351 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.928045 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.930214 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn"] Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.931554 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6"] Jan 26 00:09:35 crc kubenswrapper[4874]: W0126 00:09:35.944197 4874 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8fd9d9e_4607_4d4d_95f9_c18c37440d65.slice/crio-f12b596a7998c161f31e564e4897e01abdedfba8b8134957cf2eb9b2e73ff76f WatchSource:0}: Error finding container f12b596a7998c161f31e564e4897e01abdedfba8b8134957cf2eb9b2e73ff76f: Status 404 returned error can't find the container with id f12b596a7998c161f31e564e4897e01abdedfba8b8134957cf2eb9b2e73ff76f Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.947097 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:35 crc kubenswrapper[4874]: I0126 00:09:35.947511 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc"] Jan 26 00:09:35 crc kubenswrapper[4874]: E0126 00:09:35.947974 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.44793741 +0000 UTC m=+151.182043947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: W0126 00:09:36.001362 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42350fca_6366_4ec7_8b0c_2160cb713c80.slice/crio-eb05006d037faff9d2ccf507dac16466222e1b8280c9ce06f77d1d92b97cbc9d WatchSource:0}: Error finding container eb05006d037faff9d2ccf507dac16466222e1b8280c9ce06f77d1d92b97cbc9d: Status 404 returned error can't find the container with id eb05006d037faff9d2ccf507dac16466222e1b8280c9ce06f77d1d92b97cbc9d Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.048045 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.548020256 +0000 UTC m=+151.282126793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.048180 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.048454 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.049037 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.549028904 +0000 UTC m=+151.283135441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: W0126 00:09:36.060094 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85fbf881_a4a3_4e2a_9c33_b332dfcf7d65.slice/crio-2acf9ac1a8b0c8b4c557e35ff353ede4fe7d44fe7fb900f02e39d21746fafeb0 WatchSource:0}: Error finding container 2acf9ac1a8b0c8b4c557e35ff353ede4fe7d44fe7fb900f02e39d21746fafeb0: Status 404 returned error can't find the container with id 2acf9ac1a8b0c8b4c557e35ff353ede4fe7d44fe7fb900f02e39d21746fafeb0 Jan 26 00:09:36 crc kubenswrapper[4874]: W0126 00:09:36.069453 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b287aa8_5b0e_449a_bd5e_8561d3c74287.slice/crio-04979e46f63aa3ec803a169c57604c9dcf86d14dc097be0080d402ca8f2dd75b WatchSource:0}: Error finding container 04979e46f63aa3ec803a169c57604c9dcf86d14dc097be0080d402ca8f2dd75b: Status 404 returned error can't find the container with id 04979e46f63aa3ec803a169c57604c9dcf86d14dc097be0080d402ca8f2dd75b Jan 26 00:09:36 crc kubenswrapper[4874]: W0126 00:09:36.073032 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301f288a_4f83_48c0_aeef_496a3f814ee2.slice/crio-a9262786bb6e30dc4e7812c80aa6c9a6a8ed13be4501a22d751706ec76b49391 WatchSource:0}: Error finding container 
a9262786bb6e30dc4e7812c80aa6c9a6a8ed13be4501a22d751706ec76b49391: Status 404 returned error can't find the container with id a9262786bb6e30dc4e7812c80aa6c9a6a8ed13be4501a22d751706ec76b49391 Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.150054 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.150574 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.650429896 +0000 UTC m=+151.384536433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.247087 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.247441 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.251219 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.251626 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.751611282 +0000 UTC m=+151.485717819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.262319 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.276393 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp989" event={"ID":"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb","Type":"ContainerStarted","Data":"ec275b81eb2525cb6f501e80e2c9d460868ff2f5f1850e9d2fdd065b866c6a16"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.289279 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" event={"ID":"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65","Type":"ContainerStarted","Data":"2acf9ac1a8b0c8b4c557e35ff353ede4fe7d44fe7fb900f02e39d21746fafeb0"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.295018 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:36 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:36 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:36 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.295072 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.313817 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" event={"ID":"854a7fc0-a802-478f-98c3-3e1c494c40d9","Type":"ContainerStarted","Data":"0289e3faa14baa9ab913233c738733d6f18867ce85c25d7e92552e86253ffd3d"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.319949 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" event={"ID":"e4132d14-d050-466d-8c03-71a3fd5f08a5","Type":"ContainerStarted","Data":"d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.330380 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.330722 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.351519 4874 patch_prober.go:28] interesting pod/apiserver-76f77b778f-d56td container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 26 00:09:36 crc 
kubenswrapper[4874]: [+]log ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]etcd ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/generic-apiserver-start-informers ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/max-in-flight-filter ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 26 00:09:36 crc kubenswrapper[4874]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 26 00:09:36 crc kubenswrapper[4874]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/project.openshift.io-projectcache ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/openshift.io-startinformers ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 26 00:09:36 crc kubenswrapper[4874]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 26 00:09:36 crc kubenswrapper[4874]: livez check failed Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.351575 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-d56td" podUID="7adc2094-bd76-49f1-a373-5cbe2212ec76" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.352017 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.354219 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.854200707 +0000 UTC m=+151.588307244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.354973 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.355330 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" event={"ID":"09eba8a2-56e9-4722-9d7f-c041b2ab6d36","Type":"ContainerStarted","Data":"22d53dc069a52153ae54a9d45f1334886d0d9edef274e78bce08b00f33f7211c"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.356425 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.356770 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.856760619 +0000 UTC m=+151.590867156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.372253 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" event={"ID":"df3f7986-e9c7-4647-9a1e-d509480a4824","Type":"ContainerStarted","Data":"9fef8f4b6aef51b065874d7643e453e109af3fc401fac717a6db983868b9a6e3"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.419663 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" event={"ID":"be0c7c98-d60e-40cb-bec7-0685c2ef11a2","Type":"ContainerStarted","Data":"4f97fd1f30efbb63edd626c8e73de8d31685597cd532807d97c6a4bdb0138dae"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.427608 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xqz88" event={"ID":"0ddb705c-de1d-440f-826e-b539340b1eca","Type":"ContainerStarted","Data":"c7d8bf6b8428e60fee986b3d72cb4a9a007d0dae5c925ea1025b80b0240a3273"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.458580 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.459666 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:36.959647832 +0000 UTC m=+151.693754369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.475232 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" event={"ID":"165b8f95-910a-46b7-9e16-c3b813e54831","Type":"ContainerStarted","Data":"70708e347da6c077de9a8ab5a53ab784a6d2d9b2644296c67ed7c8461ff9609c"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.504222 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" event={"ID":"5b287aa8-5b0e-449a-bd5e-8561d3c74287","Type":"ContainerStarted","Data":"04979e46f63aa3ec803a169c57604c9dcf86d14dc097be0080d402ca8f2dd75b"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.537266 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" event={"ID":"e02ae627-2219-4dbd-b96e-219b902d594a","Type":"ContainerStarted","Data":"9edbc6465fec9fc75b250154cf966ec315dddd901db3f5de851477c5123e588d"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.545599 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" event={"ID":"3f7e087e-3f78-4e39-81e9-ad6c357bb27b","Type":"ContainerStarted","Data":"c7597069650aeeb961916de9f60ec898c4445346a9ae082bfd2d6cd3c8d30c3d"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.545642 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" event={"ID":"3f7e087e-3f78-4e39-81e9-ad6c357bb27b","Type":"ContainerStarted","Data":"bbb0d2c3a670840487ffda30296eba4e536deeafbbdfe4b20a040d839efcca85"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.558217 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rsrb" event={"ID":"e8fd9d9e-4607-4d4d-95f9-c18c37440d65","Type":"ContainerStarted","Data":"f12b596a7998c161f31e564e4897e01abdedfba8b8134957cf2eb9b2e73ff76f"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.564584 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" podStartSLOduration=127.564563772 podStartE2EDuration="2m7.564563772s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.56339874 +0000 UTC m=+151.297505277" watchObservedRunningTime="2026-01-26 00:09:36.564563772 +0000 UTC m=+151.298670309" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.565815 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" podStartSLOduration=127.565809687 podStartE2EDuration="2m7.565809687s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.388125522 +0000 UTC m=+151.122232079" watchObservedRunningTime="2026-01-26 00:09:36.565809687 +0000 UTC m=+151.299916224" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.582856 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.583413 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.083401307 +0000 UTC m=+151.817507844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.589373 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerStarted","Data":"334b5aea9863f53fcc5738bcac353ee91b6959e47bdc154b420c1f8104992b4e"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.589431 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerStarted","Data":"6e97477674a8683fa399a0f2fdaa72a70bcd18eac2f3a675d5c59b9f0838e7ad"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.598669 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.604265 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tfbt" podStartSLOduration=127.604233247 podStartE2EDuration="2m7.604233247s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.602721245 +0000 UTC m=+151.336827782" watchObservedRunningTime="2026-01-26 00:09:36.604233247 +0000 UTC m=+151.338339784" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.610305 4874 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8ffn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.610375 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" 
podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.630272 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" podStartSLOduration=127.63022887 podStartE2EDuration="2m7.63022887s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.626226409 +0000 UTC m=+151.360332956" watchObservedRunningTime="2026-01-26 00:09:36.63022887 +0000 UTC m=+151.364335407" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.656087 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" event={"ID":"cabfc520-9523-45b8-8be1-57e71788c5a7","Type":"ContainerStarted","Data":"fb02bc9343a8b9df98138be076e434e5ad00b7af5cb06e63183894584a251b7a"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.657765 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.659383 4874 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8k592 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.659563 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.684270 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.685891 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.185871089 +0000 UTC m=+151.919977626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.700840 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" podStartSLOduration=128.700821605 podStartE2EDuration="2m8.700821605s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.697491232 +0000 UTC m=+151.431597789" watchObservedRunningTime="2026-01-26 00:09:36.700821605 +0000 UTC m=+151.434928142" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.736797 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" event={"ID":"a2630ca0-dedc-4fc8-96b7-deddea64e443","Type":"ContainerStarted","Data":"bc9af32a31a733cbe526f4627ee51c6e15e3009c62e687f7524d1c21aa657d57"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.738549 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" event={"ID":"a2630ca0-dedc-4fc8-96b7-deddea64e443","Type":"ContainerStarted","Data":"898d08efea8bb4baf75fabd5e17a2b03a3ae78fdfcfbf3d56532b7505b833142"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.747816 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s6njc" event={"ID":"b7ea5432-a092-4b86-99c0-04ec6257f221","Type":"ContainerStarted","Data":"32bc4932c9c133ab748f2f734dcb34057c72dac6792df4bde746a05c82438579"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.773610 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" event={"ID":"301f288a-4f83-48c0-aeef-496a3f814ee2","Type":"ContainerStarted","Data":"a9262786bb6e30dc4e7812c80aa6c9a6a8ed13be4501a22d751706ec76b49391"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.786500 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.788039 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.288025502 +0000 UTC m=+152.022132039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.793131 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" event={"ID":"e9650625-1b67-4b67-8a69-d8738bf6d762","Type":"ContainerStarted","Data":"c7a9659ec5bffc6ed00efb1e3a7d02528ad0a3392cf51f5433de43fc40fa8313"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.793209 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" event={"ID":"e9650625-1b67-4b67-8a69-d8738bf6d762","Type":"ContainerStarted","Data":"311af82da54cabf2d49feb83eb3e8328f34da53da740bec7d8c82a9a4724f1b1"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.794197 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.808325 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-s6njc" podStartSLOduration=127.808303866 podStartE2EDuration="2m7.808303866s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.800297804 +0000 UTC m=+151.534404361" watchObservedRunningTime="2026-01-26 00:09:36.808303866 +0000 UTC m=+151.542410403" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.808506 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lfgxs" podStartSLOduration=127.808501832 podStartE2EDuration="2m7.808501832s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.771306607 +0000 UTC m=+151.505413144" watchObservedRunningTime="2026-01-26 00:09:36.808501832 +0000 UTC m=+151.542608369" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.811946 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" event={"ID":"e10d7cc7-6177-4a1f-882c-91d6bf3a738f","Type":"ContainerStarted","Data":"ed29fd2e3d131f6cda0b20faa12d03836336f4f878b053279d3db7284b85b5b7"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.813794 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" event={"ID":"c604832e-4a13-4848-9deb-259c130a4881","Type":"ContainerStarted","Data":"1d692b13c5c0b0924c6f765a98a9521a1f6295c160ed8755eeb9a8890053a263"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.818441 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" event={"ID":"02f0e3a2-429a-4997-bd5d-a7d808d97282","Type":"ContainerStarted","Data":"fc6c840c3f408550896c4af465353bae51f97851acb66baf489d5af2774d2883"} Jan 
26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.818505 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" event={"ID":"02f0e3a2-429a-4997-bd5d-a7d808d97282","Type":"ContainerStarted","Data":"42fa27d9776c1c336f8cfe753cad5cfd33db3a6a867416df27630742c0d3a898"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.825750 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" podStartSLOduration=127.825721361 podStartE2EDuration="2m7.825721361s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.825726341 +0000 UTC m=+151.559832898" watchObservedRunningTime="2026-01-26 00:09:36.825721361 +0000 UTC m=+151.559827898" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.829170 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" event={"ID":"d43627d2-c1f4-4335-ad78-0f49d3be3aed","Type":"ContainerStarted","Data":"d245c12542addebfecc13495812bae0931564971ebca13cdef32bf2a7bb80df3"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.831686 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" event={"ID":"42350fca-6366-4ec7-8b0c-2160cb713c80","Type":"ContainerStarted","Data":"eb05006d037faff9d2ccf507dac16466222e1b8280c9ce06f77d1d92b97cbc9d"} Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.838569 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z82bh" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.840669 4874 patch_prober.go:28] interesting pod/downloads-7954f5f757-ks9sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.840725 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ks9sj" podUID="92ad92d9-cb0d-4c2a-a19d-e627164e297a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.842185 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mcbjn" podStartSLOduration=127.842166799 podStartE2EDuration="2m7.842166799s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:36.838294191 +0000 UTC m=+151.572400728" watchObservedRunningTime="2026-01-26 00:09:36.842166799 +0000 UTC m=+151.576273336" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.892254 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lpbc4" Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.892802 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:36 crc kubenswrapper[4874]: E0126 00:09:36.895491 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.395470632 +0000 UTC m=+152.129577169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:36 crc kubenswrapper[4874]: I0126 00:09:36.996648 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.004415 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.504398814 +0000 UTC m=+152.238505351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.098895 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.099496 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.599477121 +0000 UTC m=+152.333583658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.148504 4874 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.200502 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.200931 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.700917994 +0000 UTC m=+152.435024531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.295034 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:37 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:37 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:37 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.295243 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.302368 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.302735 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-26 00:09:37.802714938 +0000 UTC m=+152.536821465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.404256 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.404650 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:37.904634395 +0000 UTC m=+152.638740932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.506204 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.506326 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.006300204 +0000 UTC m=+152.740406741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.506468 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.506842 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.006824519 +0000 UTC m=+152.740931056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.608152 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.608435 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.108401076 +0000 UTC m=+152.842507613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.608567 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.608948 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.108940211 +0000 UTC m=+152.843046748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.709387 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.709588 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.209555762 +0000 UTC m=+152.943662299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.709678 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.709983 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.209970353 +0000 UTC m=+152.944076890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nmgqb" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.806188 4874 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-26T00:09:37.148535076Z","Handler":null,"Name":""} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.811616 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:37 crc kubenswrapper[4874]: E0126 00:09:37.812069 4874 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-26 00:09:38.312044674 +0000 UTC m=+153.046151211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.812818 4874 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.812857 4874 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.839629 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" event={"ID":"42350fca-6366-4ec7-8b0c-2160cb713c80","Type":"ContainerStarted","Data":"0d5f23f211094b61a77e4ee8726339a77c89a9bfa22eb90fa9f85c5d9aa35a8a"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.841040 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.843942 4874 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sp6fv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.844021 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" podUID="42350fca-6366-4ec7-8b0c-2160cb713c80" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.848643 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nrq2s" event={"ID":"e02ae627-2219-4dbd-b96e-219b902d594a","Type":"ContainerStarted","Data":"fbc6cfb7a7de03135294cd66c48b9f7d04db1d0bb9e815c0f9b03bc3839dbd10"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.877566 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rsrb" event={"ID":"e8fd9d9e-4607-4d4d-95f9-c18c37440d65","Type":"ContainerStarted","Data":"60a5fd0d7b6ad574206e81a6866b826ca51520c9f9aae95eb03d722160e9827a"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.881400 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" event={"ID":"854a7fc0-a802-478f-98c3-3e1c494c40d9","Type":"ContainerStarted","Data":"4a7d32a617a1de2a0a232f03c65b13b2315d15744c7d0efa4ac104b2580facd6"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.888047 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" 
event={"ID":"c604832e-4a13-4848-9deb-259c130a4881","Type":"ContainerStarted","Data":"95239ead37a5d4f8f1aeded5535ff0465df4ab757caf577c5f3295f1aa28a799"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.888107 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" event={"ID":"c604832e-4a13-4848-9deb-259c130a4881","Type":"ContainerStarted","Data":"eff299090f14b4d9db3e4e67a7ce6693130bc57d09f0d2058b22f88936eaa14f"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.899059 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" event={"ID":"9dc8424a-928a-4d4d-a1e7-02429361ecfb","Type":"ContainerStarted","Data":"fbe13d706252bf1d7a952617f21746dc56b2f3127ab648436bde304dce684bbf"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.899122 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" event={"ID":"9dc8424a-928a-4d4d-a1e7-02429361ecfb","Type":"ContainerStarted","Data":"7c15ac2c5313266e9c90c4a15ba8f711b0d91201454a17146cede0172b25cabc"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.901214 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" event={"ID":"e4132d14-d050-466d-8c03-71a3fd5f08a5","Type":"ContainerStarted","Data":"86956dcf2e66542b4985fcdfa77a3225e88b78e0f5eb81da68f0690bf61de7b0"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.903292 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4rsrb" podStartSLOduration=8.903280764 podStartE2EDuration="8.903280764s" podCreationTimestamp="2026-01-26 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:37.900713992 +0000 UTC m=+152.634820539" watchObservedRunningTime="2026-01-26 00:09:37.903280764 +0000 UTC m=+152.637387301" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.904142 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" podStartSLOduration=128.904133677 podStartE2EDuration="2m8.904133677s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:37.875415948 +0000 UTC m=+152.609522495" watchObservedRunningTime="2026-01-26 00:09:37.904133677 +0000 UTC m=+152.638240214" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.913581 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.916164 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" event={"ID":"301f288a-4f83-48c0-aeef-496a3f814ee2","Type":"ContainerStarted","Data":"60bb62d93a6659471f29e05495caa2717908aee2a13fe49917e831a03cbebf92"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.916215 4874 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" event={"ID":"301f288a-4f83-48c0-aeef-496a3f814ee2","Type":"ContainerStarted","Data":"6a37f8112c6c87ef47b5118df34389c5c5c2592a97234fcd8d1d12e0e5fcd6b7"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.920281 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" event={"ID":"e10d7cc7-6177-4a1f-882c-91d6bf3a738f","Type":"ContainerStarted","Data":"d895e33e1c4f2f18369af4c44453c5895efd289186cc4af1db93e4ab66c8be33"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.920334 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" event={"ID":"e10d7cc7-6177-4a1f-882c-91d6bf3a738f","Type":"ContainerStarted","Data":"bd5bbddc2fe32a24cc9b01638b11388ec3e9baaa40e1b71c716f88b38288ff93"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.923698 4874 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.923733 4874 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.930545 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zs4sn" podStartSLOduration=128.930534792 podStartE2EDuration="2m8.930534792s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:37.927747875 +0000 UTC m=+152.661854412" watchObservedRunningTime="2026-01-26 00:09:37.930534792 +0000 UTC m=+152.664641329" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.940809 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp989" event={"ID":"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb","Type":"ContainerStarted","Data":"2bcec1377aa873c1172ae9559234132986c1219a82d3e52d7082865dfd7d8573"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.940851 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wp989" event={"ID":"12d5fc5a-ccab-4fb9-8d96-fb6c119e0ccb","Type":"ContainerStarted","Data":"80008e1710844e35935d9b9f5c4797143c52b1b4c9b7fa00bcb3db022fb62a3a"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.941573 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wp989" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.951419 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" event={"ID":"165b8f95-910a-46b7-9e16-c3b813e54831","Type":"ContainerStarted","Data":"2d735aea1fd21f070a23febf09ba1955bae2350332250fbab36aba5386d2b078"} Jan 26 00:09:37 crc kubenswrapper[4874]: 
I0126 00:09:37.951492 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" event={"ID":"165b8f95-910a-46b7-9e16-c3b813e54831","Type":"ContainerStarted","Data":"d192e7900832100d36243297289f6955752cc4a7a65c942308ed456c046b04f3"} Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.965061 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-p65x9" podStartSLOduration=128.965038083 podStartE2EDuration="2m8.965038083s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:37.962488802 +0000 UTC m=+152.696595359" watchObservedRunningTime="2026-01-26 00:09:37.965038083 +0000 UTC m=+152.699144620" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.980739 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nmgqb\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:37 crc kubenswrapper[4874]: I0126 00:09:37.988017 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" event={"ID":"d43627d2-c1f4-4335-ad78-0f49d3be3aed","Type":"ContainerStarted","Data":"9620b7dadb1e630cd62ce322b8dab6b4181b3742a4326146e26316dda34b4a73"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.003262 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" event={"ID":"df3f7986-e9c7-4647-9a1e-d509480a4824","Type":"ContainerStarted","Data":"51e7e864cf7dd5e4ee32c890ba0b53217b0bc6dfed81211b9507d06b6e70f1b8"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.004071 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gjzmd" podStartSLOduration=129.004015758 podStartE2EDuration="2m9.004015758s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.002610229 +0000 UTC m=+152.736716766" watchObservedRunningTime="2026-01-26 00:09:38.004015758 +0000 UTC m=+152.738122295" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.009175 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.017250 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xqz88" event={"ID":"0ddb705c-de1d-440f-826e-b539340b1eca","Type":"ContainerStarted","Data":"d30d980775791aa458379025f611a47783a374f7f854262d2a0b3574713930dc"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.018375 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.018804 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.026163 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fr5kd" podStartSLOduration=129.026147534 podStartE2EDuration="2m9.026147534s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.023817529 +0000 UTC m=+152.757924066" watchObservedRunningTime="2026-01-26 00:09:38.026147534 +0000 UTC m=+152.760254061" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.036135 4874 patch_prober.go:28] interesting pod/console-operator-58897d9998-xqz88 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.036212 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xqz88" podUID="0ddb705c-de1d-440f-826e-b539340b1eca" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.042441 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.046581 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" event={"ID":"5b287aa8-5b0e-449a-bd5e-8561d3c74287","Type":"ContainerStarted","Data":"3b8dcc34c16a2f9abcd8249bc75e28485f9c3661467ca25ce3ad12ca7a3fd4f0"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.071825 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" event={"ID":"85fbf881-a4a3-4e2a-9c33-b332dfcf7d65","Type":"ContainerStarted","Data":"c281673c7eb8307e96a99aae33afc1bea257f6eed28ea75b3ec0c4d2df7e8c2b"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.083322 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" event={"ID":"be0c7c98-d60e-40cb-bec7-0685c2ef11a2","Type":"ContainerStarted","Data":"fcfa0af9f77cbc4215d8ee72ad035100ee5e9e25d9d02791667eb3688ca3676e"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.083375 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" event={"ID":"be0c7c98-d60e-40cb-bec7-0685c2ef11a2","Type":"ContainerStarted","Data":"3f9577c542161c715cd911d55142e75b75f0a1179c9a381f9f94252a7c0b47ed"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.084107 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.084363 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.090002 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" podStartSLOduration=130.089989571 podStartE2EDuration="2m10.089989571s" podCreationTimestamp="2026-01-26 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.060434108 +0000 UTC m=+152.794540635" watchObservedRunningTime="2026-01-26 00:09:38.089989571 +0000 UTC m=+152.824096108" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.090800 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wp989" podStartSLOduration=9.090795893 podStartE2EDuration="9.090795893s" podCreationTimestamp="2026-01-26 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.088364415 +0000 UTC m=+152.822470952" watchObservedRunningTime="2026-01-26 00:09:38.090795893 +0000 UTC m=+152.824902430" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.092102 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" event={"ID":"cabfc520-9523-45b8-8be1-57e71788c5a7","Type":"ContainerStarted","Data":"8a60f1ef8836ef1b6b065e9b7b3a29c41522c2c6792c36df10d2d8138eff9947"} Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.103931 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.109374 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7qhd5" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.178701 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t75tb" podStartSLOduration=129.178680379 podStartE2EDuration="2m9.178680379s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.175786889 +0000 UTC m=+152.909893426" watchObservedRunningTime="2026-01-26 00:09:38.178680379 +0000 UTC m=+152.912786906" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.229315 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ndg7w" podStartSLOduration=129.229297848 podStartE2EDuration="2m9.229297848s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.227365494 +0000 UTC m=+152.961472041" watchObservedRunningTime="2026-01-26 00:09:38.229297848 +0000 UTC m=+152.963404385" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.266032 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.299654 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:38 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:38 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:38 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.300336 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.340354 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xqz88" podStartSLOduration=129.340331828 podStartE2EDuration="2m9.340331828s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.339428543 +0000 UTC m=+153.073535080" watchObservedRunningTime="2026-01-26 00:09:38.340331828 +0000 UTC m=+153.074438365" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.429670 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2cnr6" podStartSLOduration=129.429648315 podStartE2EDuration="2m9.429648315s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-26 00:09:38.42408221 +0000 UTC m=+153.158188767" watchObservedRunningTime="2026-01-26 00:09:38.429648315 +0000 UTC m=+153.163754852" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.539502 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" podStartSLOduration=129.539480892 podStartE2EDuration="2m9.539480892s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.511632646 +0000 UTC m=+153.245739183" watchObservedRunningTime="2026-01-26 00:09:38.539480892 +0000 UTC m=+153.273587429" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.562249 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwrvc" podStartSLOduration=129.562232025 podStartE2EDuration="2m9.562232025s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.560871767 +0000 UTC m=+153.294978304" watchObservedRunningTime="2026-01-26 00:09:38.562232025 +0000 UTC m=+153.296338562" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.562871 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7nm8d" podStartSLOduration=129.562866792 podStartE2EDuration="2m9.562866792s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:38.538860044 +0000 UTC m=+153.272966671" watchObservedRunningTime="2026-01-26 00:09:38.562866792 +0000 UTC m=+153.296973329" Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.705344 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:09:38 crc kubenswrapper[4874]: I0126 00:09:38.905073 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.097146 4874 generic.go:334] "Generic (PLEG): container finished" podID="e4132d14-d050-466d-8c03-71a3fd5f08a5" containerID="86956dcf2e66542b4985fcdfa77a3225e88b78e0f5eb81da68f0690bf61de7b0" exitCode=0 Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.097220 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" event={"ID":"e4132d14-d050-466d-8c03-71a3fd5f08a5","Type":"ContainerDied","Data":"86956dcf2e66542b4985fcdfa77a3225e88b78e0f5eb81da68f0690bf61de7b0"} Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.098566 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" event={"ID":"5e4be981-c476-4a3d-a3a5-85e3a30b8376","Type":"ContainerStarted","Data":"90099ad346900b5918821d2af4946f8ce68106e3491f3c75433cfd570a313e01"} Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.098605 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" 
event={"ID":"5e4be981-c476-4a3d-a3a5-85e3a30b8376","Type":"ContainerStarted","Data":"8cd09f0357449a8c1b7e71e944a43afc04a539c47ce44f02dd7a0ccef6103f08"} Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.098704 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.100928 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" event={"ID":"9dc8424a-928a-4d4d-a1e7-02429361ecfb","Type":"ContainerStarted","Data":"7f24c728936ca98b256d9950dbe6c0b43a8037a410cac85a3f7da8b7935e1cb6"} Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.195032 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" podStartSLOduration=130.195002776 podStartE2EDuration="2m10.195002776s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:39.18903205 +0000 UTC m=+153.923138597" watchObservedRunningTime="2026-01-26 00:09:39.195002776 +0000 UTC m=+153.929109313" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.196147 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-k7t65" podStartSLOduration=10.196140568 podStartE2EDuration="10.196140568s" podCreationTimestamp="2026-01-26 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:39.166204274 +0000 UTC m=+153.900310811" watchObservedRunningTime="2026-01-26 00:09:39.196140568 +0000 UTC m=+153.930247105" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.281675 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp6fv" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.293106 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:39 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:39 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:39 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.293378 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:39 crc kubenswrapper[4874]: I0126 00:09:39.630745 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.032931 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xqz88" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.290438 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:40 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:40 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:40 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.290495 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.309416 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.312004 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:09:40 crc kubenswrapper[4874]: E0126 00:09:40.312251 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4132d14-d050-466d-8c03-71a3fd5f08a5" containerName="collect-profiles" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.312272 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4132d14-d050-466d-8c03-71a3fd5f08a5" containerName="collect-profiles" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.312421 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4132d14-d050-466d-8c03-71a3fd5f08a5" containerName="collect-profiles" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.313252 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.315145 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.322328 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.353901 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume\") pod \"e4132d14-d050-466d-8c03-71a3fd5f08a5\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.353944 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume\") pod \"e4132d14-d050-466d-8c03-71a3fd5f08a5\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.354033 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtcwg\" (UniqueName: \"kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg\") pod \"e4132d14-d050-466d-8c03-71a3fd5f08a5\" (UID: \"e4132d14-d050-466d-8c03-71a3fd5f08a5\") " Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.354214 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities\") pod \"certified-operators-ps2v4\" (UID: 
\"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.354233 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jmh\" (UniqueName: \"kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.354297 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.354820 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "e4132d14-d050-466d-8c03-71a3fd5f08a5" (UID: "e4132d14-d050-466d-8c03-71a3fd5f08a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.360929 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg" (OuterVolumeSpecName: "kube-api-access-xtcwg") pod "e4132d14-d050-466d-8c03-71a3fd5f08a5" (UID: "e4132d14-d050-466d-8c03-71a3fd5f08a5"). InnerVolumeSpecName "kube-api-access-xtcwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.372766 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e4132d14-d050-466d-8c03-71a3fd5f08a5" (UID: "e4132d14-d050-466d-8c03-71a3fd5f08a5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455344 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455395 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jmh\" (UniqueName: \"kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455458 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455495 4874 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4132d14-d050-466d-8c03-71a3fd5f08a5-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455508 4874 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4132d14-d050-466d-8c03-71a3fd5f08a5-config-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455518 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtcwg\" (UniqueName: \"kubernetes.io/projected/e4132d14-d050-466d-8c03-71a3fd5f08a5-kube-api-access-xtcwg\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.455991 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.456014 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.470974 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jmh\" (UniqueName: \"kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh\") pod \"certified-operators-ps2v4\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.510088 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.511168 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.513008 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.519357 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.556420 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.556463 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfx9\" (UniqueName: \"kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.556613 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.634571 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.658188 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.658242 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfx9\" (UniqueName: \"kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.658341 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.658722 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.658751 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.686641 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfx9\" (UniqueName: \"kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9\") pod \"community-operators-r4fgs\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.711554 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.713612 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.728606 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.760201 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.760257 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.760309 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwgr\" (UniqueName: \"kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.839839 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.861133 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwgr\" (UniqueName: \"kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.861252 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.861275 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.861762 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.862109 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content\") pod \"certified-operators-llwxh\" (UID: 
\"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.867978 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.884606 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwgr\" (UniqueName: \"kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr\") pod \"certified-operators-llwxh\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.910427 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.919754 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.932338 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.962608 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.962745 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdpz\" (UniqueName: \"kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:40 crc kubenswrapper[4874]: I0126 00:09:40.962859 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.047649 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.051330 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:09:41 crc kubenswrapper[4874]: W0126 00:09:41.052200 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87afa5b5_ce89_41ab_8fe6_cb82acb882aa.slice/crio-5af821e5864d2182b0a3e1878b69ed7d168270ad6aaca564b3447c252da847fa WatchSource:0}: Error finding container 5af821e5864d2182b0a3e1878b69ed7d168270ad6aaca564b3447c252da847fa: Status 404 returned error can't find the container with id 5af821e5864d2182b0a3e1878b69ed7d168270ad6aaca564b3447c252da847fa Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.063513 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.063573 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdpz\" (UniqueName: \"kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.063624 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.064024 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.064085 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.095582 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdpz\" (UniqueName: \"kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz\") pod \"community-operators-qx779\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.115158 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" event={"ID":"e4132d14-d050-466d-8c03-71a3fd5f08a5","Type":"ContainerDied","Data":"d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625"} Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.115212 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6a42a3fc9b27c1695f2978ed63627d07499fff67e13c19c0030c066aa8e9625" Jan 26 00:09:41 
crc kubenswrapper[4874]: I0126 00:09:41.115365 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489760-c46dr" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.116120 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerStarted","Data":"5af821e5864d2182b0a3e1878b69ed7d168270ad6aaca564b3447c252da847fa"} Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.129586 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerStarted","Data":"e9bb97a55c863a7c5865a423da09e85b519c309c916cb2171f0bda103e6ed9e3"} Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.260535 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:09:41 crc kubenswrapper[4874]: W0126 00:09:41.270375 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff351744_6cfe_455c_851d_c8e218efee0e.slice/crio-9a4d1a0a7b66b4aca57010deebe6f4f4fa6827ce43b54eeda932f2a28d75b44a WatchSource:0}: Error finding container 9a4d1a0a7b66b4aca57010deebe6f4f4fa6827ce43b54eeda932f2a28d75b44a: Status 404 returned error can't find the container with id 9a4d1a0a7b66b4aca57010deebe6f4f4fa6827ce43b54eeda932f2a28d75b44a Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.282572 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.291324 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:41 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:41 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:41 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.291395 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.338985 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.343626 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-d56td" Jan 26 00:09:41 crc kubenswrapper[4874]: I0126 00:09:41.653825 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:09:41 crc kubenswrapper[4874]: W0126 00:09:41.662575 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11f67a6_370d_4591_bebd_152ad8e10326.slice/crio-a8e9a1b1ca744d78a447b24983bfafb8db9bec320c01f4373a60a15a788749bc WatchSource:0}: Error finding container a8e9a1b1ca744d78a447b24983bfafb8db9bec320c01f4373a60a15a788749bc: 
Status 404 returned error can't find the container with id a8e9a1b1ca744d78a447b24983bfafb8db9bec320c01f4373a60a15a788749bc Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.137243 4874 generic.go:334] "Generic (PLEG): container finished" podID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerID="fa18429a90e672674b8b6627b229105d2fdec8d60b7e83af4abae8c62274d903" exitCode=0 Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.137339 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerDied","Data":"fa18429a90e672674b8b6627b229105d2fdec8d60b7e83af4abae8c62274d903"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.138747 4874 generic.go:334] "Generic (PLEG): container finished" podID="f11f67a6-370d-4591-bebd-152ad8e10326" containerID="cd10a81d45966b3ca36847c3a9fd8dab2471d6070551eb4fd5fed74cac2cdfa5" exitCode=0 Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.138789 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerDied","Data":"cd10a81d45966b3ca36847c3a9fd8dab2471d6070551eb4fd5fed74cac2cdfa5"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.138814 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerStarted","Data":"a8e9a1b1ca744d78a447b24983bfafb8db9bec320c01f4373a60a15a788749bc"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.140252 4874 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.141395 4874 generic.go:334] "Generic (PLEG): container finished" podID="ff351744-6cfe-455c-851d-c8e218efee0e" containerID="d877e7285c966607ab2e3833349b1c93314219991a5a072336334f3a4a4e95f6" exitCode=0 Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.141460 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerDied","Data":"d877e7285c966607ab2e3833349b1c93314219991a5a072336334f3a4a4e95f6"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.141482 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerStarted","Data":"9a4d1a0a7b66b4aca57010deebe6f4f4fa6827ce43b54eeda932f2a28d75b44a"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.144095 4874 generic.go:334] "Generic (PLEG): container finished" podID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerID="d3ee6f5c7b6505d9082eb619b62ffd91443f40d4f302bbb3d4affb339e36a127" exitCode=0 Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.144122 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerDied","Data":"d3ee6f5c7b6505d9082eb619b62ffd91443f40d4f302bbb3d4affb339e36a127"} Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.210314 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ks9sj" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.288059 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.292084 4874 patch_prober.go:28] interesting pod/router-default-5444994796-tmxht container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 26 00:09:42 crc kubenswrapper[4874]: [-]has-synced failed: reason withheld Jan 26 00:09:42 crc kubenswrapper[4874]: [+]process-running ok Jan 26 00:09:42 crc kubenswrapper[4874]: healthz check failed Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.292140 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tmxht" podUID="aa444927-3d15-42ba-b2aa-0ea838d8726a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.315178 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.316545 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.318451 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.335820 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.337536 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.338236 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.341251 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.341564 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.354899 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.481970 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.482538 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.482648 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.482781 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.482832 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56c6b\" (UniqueName: \"kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584275 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584345 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56c6b\" (UniqueName: \"kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc 
kubenswrapper[4874]: I0126 00:09:42.584384 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584404 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584436 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584502 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.584991 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.585213 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.605699 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.615165 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56c6b\" (UniqueName: \"kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b\") pod \"redhat-marketplace-s2dns\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.636120 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.653352 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.706837 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.707851 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.723165 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.828171 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.829290 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.833188 4874 patch_prober.go:28] interesting pod/console-f9d7485db-s6njc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.833252 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-s6njc" podUID="b7ea5432-a092-4b86-99c0-04ec6257f221" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.890343 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.890421 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.890507 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2mq5\" (UniqueName: \"kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.896367 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:09:42 crc kubenswrapper[4874]: W0126 00:09:42.905019 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a28b4a4_b5c7_4f8f_9577_a65d9148efb5.slice/crio-774c82869774b1bfbcc58a500b4be436c66dcbfe975967271f0cb6a3c7f7f663 WatchSource:0}: Error finding container 774c82869774b1bfbcc58a500b4be436c66dcbfe975967271f0cb6a3c7f7f663: Status 404 returned error can't find the container 
with id 774c82869774b1bfbcc58a500b4be436c66dcbfe975967271f0cb6a3c7f7f663 Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.993639 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.993722 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.994477 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2mq5\" (UniqueName: \"kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.994693 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:42 crc kubenswrapper[4874]: I0126 00:09:42.995291 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.021219 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2mq5\" (UniqueName: \"kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5\") pod \"redhat-marketplace-m7zdk\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.030108 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.164228 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerStarted","Data":"2c77ceb39ee7dcfd56f9edf137cfa7bc8aaa61e168d9f826089daab3586ddbf8"} Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.164456 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerStarted","Data":"774c82869774b1bfbcc58a500b4be436c66dcbfe975967271f0cb6a3c7f7f663"} Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.212020 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.296335 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.327620 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tmxht" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.518828 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.521590 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.530900 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.534090 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.536521 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.711128 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.711187 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxv7\" (UniqueName: \"kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.711214 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.812275 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.812325 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxv7\" (UniqueName: \"kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.812356 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.812774 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.812804 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.858251 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxv7\" (UniqueName: \"kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7\") pod \"redhat-operators-bdrl5\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.869789 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.915644 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.918403 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:43 crc kubenswrapper[4874]: I0126 00:09:43.931438 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.118511 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.118657 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbnp\" (UniqueName: \"kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.118728 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.192717 4874 generic.go:334] "Generic (PLEG): container finished" podID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerID="2c77ceb39ee7dcfd56f9edf137cfa7bc8aaa61e168d9f826089daab3586ddbf8" exitCode=0 Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.192814 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerDied","Data":"2c77ceb39ee7dcfd56f9edf137cfa7bc8aaa61e168d9f826089daab3586ddbf8"} Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.198858 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b87f168e-059d-49d7-ab80-ed09ce2333c8","Type":"ContainerStarted","Data":"3dcad6e05270a0b4adf3ef649288125cbef4eb4b0ba87f023727d8d06876e56b"} Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.210088 4874 generic.go:334] "Generic (PLEG): container finished" podID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerID="fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7" exitCode=0 Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.210920 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerDied","Data":"fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7"} Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.210953 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerStarted","Data":"5636a3469c98cb2cea3ccc2c112885d035c7db0bc293d21c6949f2be716e5ceb"} Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.219506 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities\") pod 
\"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.219562 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbnp\" (UniqueName: \"kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.219605 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.220339 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.222934 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.240879 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbnp\" (UniqueName: \"kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp\") pod \"redhat-operators-fphsx\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.297210 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.318093 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:09:44 crc kubenswrapper[4874]: W0126 00:09:44.403419 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde4fdb5_abe2_43bf_bf7d_8460bb1a4ea7.slice/crio-a50314309f7a7fe466d7ed00fdde026d0ee33227802b2223e7673a038fae429b WatchSource:0}: Error finding container a50314309f7a7fe466d7ed00fdde026d0ee33227802b2223e7673a038fae429b: Status 404 returned error can't find the container with id a50314309f7a7fe466d7ed00fdde026d0ee33227802b2223e7673a038fae429b Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.447497 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.448668 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.454905 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.455197 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.466579 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.534094 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.534208 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.637340 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.637995 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.637456 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.661844 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.750748 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:09:44 crc kubenswrapper[4874]: I0126 00:09:44.801092 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:44 crc kubenswrapper[4874]: W0126 00:09:44.852306 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce11dbd9_2bd4_40a6_9735_2c3489511fc6.slice/crio-48b66f5e52ae25457aa71ebe3a0a836d0ed85a066bd26ad1a5bb0c0e5ff8b1d7 WatchSource:0}: Error finding container 48b66f5e52ae25457aa71ebe3a0a836d0ed85a066bd26ad1a5bb0c0e5ff8b1d7: Status 404 returned error can't find the container with id 48b66f5e52ae25457aa71ebe3a0a836d0ed85a066bd26ad1a5bb0c0e5ff8b1d7 Jan 26 00:09:45 crc kubenswrapper[4874]: I0126 00:09:45.218624 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerStarted","Data":"a50314309f7a7fe466d7ed00fdde026d0ee33227802b2223e7673a038fae429b"} Jan 26 00:09:45 crc kubenswrapper[4874]: I0126 00:09:45.220982 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerStarted","Data":"48b66f5e52ae25457aa71ebe3a0a836d0ed85a066bd26ad1a5bb0c0e5ff8b1d7"} Jan 26 00:09:45 crc kubenswrapper[4874]: I0126 00:09:45.241651 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.228558 4874 generic.go:334] "Generic (PLEG): container finished" podID="b87f168e-059d-49d7-ab80-ed09ce2333c8" containerID="dab165158bb1d4863419fe4d186c6487bc75abd7a19fbced3fe23ae0546a43ab" exitCode=0 Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.228650 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b87f168e-059d-49d7-ab80-ed09ce2333c8","Type":"ContainerDied","Data":"dab165158bb1d4863419fe4d186c6487bc75abd7a19fbced3fe23ae0546a43ab"} Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.237212 4874 generic.go:334] "Generic (PLEG): container finished" podID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerID="c96e65ad44fe7b35b346683fa52eb2210d29a75718982663c3314db1a338dc1b" exitCode=0 Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.237256 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerDied","Data":"c96e65ad44fe7b35b346683fa52eb2210d29a75718982663c3314db1a338dc1b"} Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.258776 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a","Type":"ContainerStarted","Data":"7afaf7c61598643b94a48c59d16b20b8bf3e384fcfeb14ba143bb2efb6b52ce1"} Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.258836 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a","Type":"ContainerStarted","Data":"e9cfe45dd7a95bb6f804960292a343ebb6fbca53e9b53a3ae66c43e622bb8524"} Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.263067 4874 generic.go:334] "Generic (PLEG): container finished" podID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerID="188b34a62a51836a004f8d9874de5d170e685e208e876608910817864609da21" exitCode=0 Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.263119 
4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerDied","Data":"188b34a62a51836a004f8d9874de5d170e685e208e876608910817864609da21"} Jan 26 00:09:46 crc kubenswrapper[4874]: I0126 00:09:46.284307 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.284288983 podStartE2EDuration="2.284288983s" podCreationTimestamp="2026-01-26 00:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:09:46.280464506 +0000 UTC m=+161.014571033" watchObservedRunningTime="2026-01-26 00:09:46.284288983 +0000 UTC m=+161.018395520" Jan 26 00:09:47 crc kubenswrapper[4874]: E0126 00:09:47.013689 4874 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podeae2c00e_68d9_4b41_8c53_4e0a5d4d846a.slice/crio-7afaf7c61598643b94a48c59d16b20b8bf3e384fcfeb14ba143bb2efb6b52ce1.scope\": RecentStats: unable to find data in memory cache]" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.165949 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.739952 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.803085 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir\") pod \"b87f168e-059d-49d7-ab80-ed09ce2333c8\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.803564 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access\") pod \"b87f168e-059d-49d7-ab80-ed09ce2333c8\" (UID: \"b87f168e-059d-49d7-ab80-ed09ce2333c8\") " Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.803822 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b87f168e-059d-49d7-ab80-ed09ce2333c8" (UID: "b87f168e-059d-49d7-ab80-ed09ce2333c8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.804102 4874 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b87f168e-059d-49d7-ab80-ed09ce2333c8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.813318 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b87f168e-059d-49d7-ab80-ed09ce2333c8" (UID: "b87f168e-059d-49d7-ab80-ed09ce2333c8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:09:47 crc kubenswrapper[4874]: I0126 00:09:47.906213 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b87f168e-059d-49d7-ab80-ed09ce2333c8-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.049225 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wp989" Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.299350 4874 generic.go:334] "Generic (PLEG): container finished" podID="eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" containerID="7afaf7c61598643b94a48c59d16b20b8bf3e384fcfeb14ba143bb2efb6b52ce1" exitCode=0 Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.299443 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a","Type":"ContainerDied","Data":"7afaf7c61598643b94a48c59d16b20b8bf3e384fcfeb14ba143bb2efb6b52ce1"} Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.302739 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b87f168e-059d-49d7-ab80-ed09ce2333c8","Type":"ContainerDied","Data":"3dcad6e05270a0b4adf3ef649288125cbef4eb4b0ba87f023727d8d06876e56b"} Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.302790 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dcad6e05270a0b4adf3ef649288125cbef4eb4b0ba87f023727d8d06876e56b" Jan 26 00:09:48 crc kubenswrapper[4874]: I0126 00:09:48.302826 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.531224 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.628551 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access\") pod \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.628698 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir\") pod \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\" (UID: \"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a\") " Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.629016 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" (UID: "eae2c00e-68d9-4b41-8c53-4e0a5d4d846a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.633244 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" (UID: "eae2c00e-68d9-4b41-8c53-4e0a5d4d846a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.729970 4874 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:49 crc kubenswrapper[4874]: I0126 00:09:49.730002 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae2c00e-68d9-4b41-8c53-4e0a5d4d846a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:09:50 crc kubenswrapper[4874]: I0126 00:09:50.326208 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eae2c00e-68d9-4b41-8c53-4e0a5d4d846a","Type":"ContainerDied","Data":"e9cfe45dd7a95bb6f804960292a343ebb6fbca53e9b53a3ae66c43e622bb8524"} Jan 26 00:09:50 crc kubenswrapper[4874]: I0126 00:09:50.326248 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9cfe45dd7a95bb6f804960292a343ebb6fbca53e9b53a3ae66c43e622bb8524" Jan 26 00:09:50 crc kubenswrapper[4874]: I0126 00:09:50.326305 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.265481 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.271500 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/489242ac-41c4-45f2-9828-58c4012e9f64-metrics-certs\") pod \"network-metrics-daemon-qkq5t\" (UID: \"489242ac-41c4-45f2-9828-58c4012e9f64\") " pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.444625 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.444718 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.539311 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkq5t" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.904786 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:52 crc kubenswrapper[4874]: I0126 00:09:52.908824 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-s6njc" Jan 26 00:09:58 crc kubenswrapper[4874]: I0126 00:09:58.278545 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:10:02 crc kubenswrapper[4874]: I0126 00:10:02.378177 4874 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-m97d7 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 26 00:10:02 crc kubenswrapper[4874]: I0126 00:10:02.378783 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-m97d7" podUID="4d5e251e-34cd-4620-90fc-0683fa3ed2ef" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 26 00:10:10 crc kubenswrapper[4874]: I0126 00:10:10.650381 4874 generic.go:334] "Generic (PLEG): container finished" podID="1413e57b-4a4d-4ae9-902f-a1e5fc080588" containerID="a08638902ea697ab404040f515d51807d63becbd73aece700d6ca8415fe34037" exitCode=0 Jan 26 00:10:10 crc kubenswrapper[4874]: I0126 00:10:10.650515 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29489760-gbhtf" event={"ID":"1413e57b-4a4d-4ae9-902f-a1e5fc080588","Type":"ContainerDied","Data":"a08638902ea697ab404040f515d51807d63becbd73aece700d6ca8415fe34037"} Jan 26 00:10:11 crc kubenswrapper[4874]: I0126 00:10:11.683468 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 26 00:10:13 crc kubenswrapper[4874]: I0126 00:10:13.010628 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xvzj9" Jan 26 00:10:15 crc kubenswrapper[4874]: E0126 00:10:15.859882 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2327204264/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 26 00:10:15 crc kubenswrapper[4874]: E0126 00:10:15.860626 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2mq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m7zdk_openshift-marketplace(b7b05ff9-4558-42dc-9ca7-64d2342a002c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2327204264/2\": happened during read: context canceled" logger="UnhandledError" Jan 26 00:10:15 crc kubenswrapper[4874]: E0126 00:10:15.862187 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2327204264/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-m7zdk" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" Jan 26 00:10:15 crc kubenswrapper[4874]: I0126 00:10:15.869871 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:10:15 crc kubenswrapper[4874]: I0126 00:10:15.961060 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca\") pod \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " Jan 26 00:10:15 crc kubenswrapper[4874]: I0126 00:10:15.961194 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mkq\" (UniqueName: \"kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq\") pod \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\" (UID: \"1413e57b-4a4d-4ae9-902f-a1e5fc080588\") " Jan 26 00:10:15 crc kubenswrapper[4874]: I0126 00:10:15.962045 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca" (OuterVolumeSpecName: "serviceca") pod "1413e57b-4a4d-4ae9-902f-a1e5fc080588" (UID: "1413e57b-4a4d-4ae9-902f-a1e5fc080588"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:10:15 crc kubenswrapper[4874]: I0126 00:10:15.967379 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq" (OuterVolumeSpecName: "kube-api-access-x5mkq") pod "1413e57b-4a4d-4ae9-902f-a1e5fc080588" (UID: "1413e57b-4a4d-4ae9-902f-a1e5fc080588"). InnerVolumeSpecName "kube-api-access-x5mkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:10:16 crc kubenswrapper[4874]: I0126 00:10:16.063218 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mkq\" (UniqueName: \"kubernetes.io/projected/1413e57b-4a4d-4ae9-902f-a1e5fc080588-kube-api-access-x5mkq\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:16 crc kubenswrapper[4874]: I0126 00:10:16.063263 4874 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1413e57b-4a4d-4ae9-902f-a1e5fc080588-serviceca\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:16 crc kubenswrapper[4874]: I0126 00:10:16.693206 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29489760-gbhtf" Jan 26 00:10:16 crc kubenswrapper[4874]: I0126 00:10:16.693249 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29489760-gbhtf" event={"ID":"1413e57b-4a4d-4ae9-902f-a1e5fc080588","Type":"ContainerDied","Data":"42fa368b26bb13eecf29705bd3953d3261055ec55a082a0258fab4fd3a2f7088"} Jan 26 00:10:16 crc kubenswrapper[4874]: I0126 00:10:16.693295 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42fa368b26bb13eecf29705bd3953d3261055ec55a082a0258fab4fd3a2f7088" Jan 26 00:10:18 crc kubenswrapper[4874]: E0126 00:10:18.113643 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m7zdk" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" Jan 26 00:10:18 crc kubenswrapper[4874]: E0126 00:10:18.172230 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 26 00:10:18 crc kubenswrapper[4874]: E0126 00:10:18.172422 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gcfx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r4fgs_openshift-marketplace(87afa5b5-ce89-41ab-8fe6-cb82acb882aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:18 crc kubenswrapper[4874]: E0126 00:10:18.173544 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r4fgs" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.450932 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.452219 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1413e57b-4a4d-4ae9-902f-a1e5fc080588" containerName="image-pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452238 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1413e57b-4a4d-4ae9-902f-a1e5fc080588" containerName="image-pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.452256 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87f168e-059d-49d7-ab80-ed09ce2333c8" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452268 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87f168e-059d-49d7-ab80-ed09ce2333c8" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.452282 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452289 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452398 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87f168e-059d-49d7-ab80-ed09ce2333c8" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452407 4874 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1413e57b-4a4d-4ae9-902f-a1e5fc080588" containerName="image-pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452419 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae2c00e-68d9-4b41-8c53-4e0a5d4d846a" containerName="pruner" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.452975 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.454380 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.455339 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.457083 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.529392 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r4fgs" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.532333 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.532726 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.606542 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.606754 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56c6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-s2dns_openshift-marketplace(3a28b4a4-b5c7-4f8f-9577-a65d9148efb5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:19 crc kubenswrapper[4874]: E0126 00:10:19.609464 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-s2dns" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.633834 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.633879 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.634319 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 00:10:19.657809 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:19 crc kubenswrapper[4874]: I0126 
00:10:19.771837 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:20 crc kubenswrapper[4874]: E0126 00:10:20.970797 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-s2dns" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.059182 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.059358 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mtwgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-llwxh_openshift-marketplace(ff351744-6cfe-455c-851d-c8e218efee0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.060726 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-llwxh" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.069376 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.069755 4874 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k8jmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ps2v4_openshift-marketplace(259aa26b-333f-4b21-8bae-8494dbc94b50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.070918 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ps2v4" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.093387 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.093579 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqdpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qx779_openshift-marketplace(f11f67a6-370d-4591-bebd-152ad8e10326): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:21 crc kubenswrapper[4874]: E0126 00:10:21.094761 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qx779" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" Jan 26 00:10:21 crc kubenswrapper[4874]: I0126 00:10:21.228713 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qkq5t"] Jan 26 00:10:22 crc kubenswrapper[4874]: I0126 00:10:22.443999 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:10:22 crc kubenswrapper[4874]: I0126 00:10:22.444295 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.037537 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.038362 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.077517 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.100541 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.100663 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.100691 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.170555 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qx779" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.170908 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ps2v4" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.171015 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-llwxh" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.201080 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.201226 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xxxv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bdrl5_openshift-marketplace(bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.201366 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.201411 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.201465 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.201489 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.201568 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.202469 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bdrl5" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.222014 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.403879 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.674511 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 26 00:10:25 crc kubenswrapper[4874]: W0126 00:10:25.684222 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4bb2d946_2cbf_4d14_9e52_385aa8235506.slice/crio-06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299 WatchSource:0}: Error finding container 06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299: Status 404 returned error can't find the container with id 06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299 Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.755708 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" event={"ID":"489242ac-41c4-45f2-9828-58c4012e9f64","Type":"ContainerStarted","Data":"a78b10f0b3027e726e9a455ff66fdaa13cd26df902cf61f9cc861c7eaddd5284"} Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.757324 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4bb2d946-2cbf-4d14-9e52-385aa8235506","Type":"ContainerStarted","Data":"06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299"} Jan 26 00:10:25 crc kubenswrapper[4874]: E0126 00:10:25.761109 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bdrl5" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" Jan 26 00:10:25 crc kubenswrapper[4874]: I0126 00:10:25.825532 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 26 00:10:27 crc kubenswrapper[4874]: I0126 00:10:27.094202 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbefa690-0d0b-4c8e-8eda-ea618b034982","Type":"ContainerStarted","Data":"d9a86f684f9a70b9f92e155ca4b39422325aff50e91a7f48fc7d254cd7e398f3"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.101169 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbefa690-0d0b-4c8e-8eda-ea618b034982","Type":"ContainerStarted","Data":"abf9585f8dec20b27d15be216deba030eff85ec641c89e9198731a891fe76156"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.103797 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" 
event={"ID":"489242ac-41c4-45f2-9828-58c4012e9f64","Type":"ContainerStarted","Data":"ce72daad3da7c41122923e8ff19347812ce91ed62b7421cfe7950ccba7f78195"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.103830 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qkq5t" event={"ID":"489242ac-41c4-45f2-9828-58c4012e9f64","Type":"ContainerStarted","Data":"a454b0ff458310d7d90301f32833e855496450654dda5651ca79f14eb547124d"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.105748 4874 generic.go:334] "Generic (PLEG): container finished" podID="4bb2d946-2cbf-4d14-9e52-385aa8235506" containerID="cd38f0029639ebd12fe2e8573f942019770c65f4d828c7f85736537161039e70" exitCode=0 Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.105873 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4bb2d946-2cbf-4d14-9e52-385aa8235506","Type":"ContainerDied","Data":"cd38f0029639ebd12fe2e8573f942019770c65f4d828c7f85736537161039e70"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.108404 4874 generic.go:334] "Generic (PLEG): container finished" podID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerID="a94fb49a36c98335b9565a257f7422a96d0bd6331629565e3b36470cd9be7d83" exitCode=0 Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.108434 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerDied","Data":"a94fb49a36c98335b9565a257f7422a96d0bd6331629565e3b36470cd9be7d83"} Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.120794 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.120773635 podStartE2EDuration="3.120773635s" podCreationTimestamp="2026-01-26 00:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:10:28.118246987 +0000 UTC m=+202.852353524" watchObservedRunningTime="2026-01-26 00:10:28.120773635 +0000 UTC m=+202.854880172" Jan 26 00:10:28 crc kubenswrapper[4874]: I0126 00:10:28.141554 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qkq5t" podStartSLOduration=179.141536879 podStartE2EDuration="2m59.141536879s" podCreationTimestamp="2026-01-26 00:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:10:28.136801214 +0000 UTC m=+202.870907751" watchObservedRunningTime="2026-01-26 00:10:28.141536879 +0000 UTC m=+202.875643416" Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.121128 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerStarted","Data":"1ee77a32afdd7748f0727dc09f2fb6f98510c2ca42ec747b1f8d8b9524f2e2b1"} Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.343748 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.421087 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access\") pod \"4bb2d946-2cbf-4d14-9e52-385aa8235506\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.421159 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir\") pod \"4bb2d946-2cbf-4d14-9e52-385aa8235506\" (UID: \"4bb2d946-2cbf-4d14-9e52-385aa8235506\") " Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.421567 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4bb2d946-2cbf-4d14-9e52-385aa8235506" (UID: "4bb2d946-2cbf-4d14-9e52-385aa8235506"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.427278 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4bb2d946-2cbf-4d14-9e52-385aa8235506" (UID: "4bb2d946-2cbf-4d14-9e52-385aa8235506"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.522692 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bb2d946-2cbf-4d14-9e52-385aa8235506-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:29 crc kubenswrapper[4874]: I0126 00:10:29.522725 4874 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bb2d946-2cbf-4d14-9e52-385aa8235506-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:30 crc kubenswrapper[4874]: I0126 00:10:30.126502 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4bb2d946-2cbf-4d14-9e52-385aa8235506","Type":"ContainerDied","Data":"06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299"} Jan 26 00:10:30 crc kubenswrapper[4874]: I0126 00:10:30.126541 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06dfbe1c1b2a0ca55a71ba81bf036f4b686bf682a27a5a4309e26a9efa814299" Jan 26 00:10:30 crc kubenswrapper[4874]: I0126 00:10:30.126592 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 26 00:10:30 crc kubenswrapper[4874]: I0126 00:10:30.915779 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8k592"] Jan 26 00:10:31 crc kubenswrapper[4874]: I0126 00:10:31.644753 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fphsx" podStartSLOduration=6.1151491 podStartE2EDuration="48.644726232s" podCreationTimestamp="2026-01-26 00:09:43 +0000 UTC" firstStartedPulling="2026-01-26 00:09:46.264746069 +0000 UTC m=+160.998852606" lastFinishedPulling="2026-01-26 00:10:28.794323211 +0000 UTC m=+203.528429738" observedRunningTime="2026-01-26 00:10:31.154131681 +0000 UTC m=+205.888238238" watchObservedRunningTime="2026-01-26 00:10:31.644726232 +0000 UTC m=+206.378832769" Jan 26 00:10:34 crc kubenswrapper[4874]: I0126 00:10:34.300758 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:34 crc kubenswrapper[4874]: I0126 00:10:34.301839 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:35 crc kubenswrapper[4874]: I0126 00:10:35.407249 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fphsx" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="registry-server" probeResult="failure" output=< Jan 26 00:10:35 crc kubenswrapper[4874]: timeout: failed to connect service ":50051" within 1s Jan 26 00:10:35 crc kubenswrapper[4874]: > Jan 26 00:10:44 crc kubenswrapper[4874]: I0126 00:10:44.361516 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:44 crc kubenswrapper[4874]: I0126 00:10:44.406498 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:44 crc kubenswrapper[4874]: I0126 00:10:44.854436 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.227198 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerStarted","Data":"b503b86aa94507f76586e00396d696e2c0418beeba252d0af8eacad174cf898d"} Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.230631 4874 generic.go:334] "Generic (PLEG): container finished" podID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerID="5014dbffb268b6b8b3aa50f0d50238d316b0ae689a8b38591cee56e964859b8e" exitCode=0 Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.230688 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerDied","Data":"5014dbffb268b6b8b3aa50f0d50238d316b0ae689a8b38591cee56e964859b8e"} Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.232478 4874 generic.go:334] "Generic (PLEG): container finished" podID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerID="27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc" exitCode=0 Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.232572 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" 
event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerDied","Data":"27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc"} Jan 26 00:10:46 crc kubenswrapper[4874]: I0126 00:10:46.232630 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fphsx" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="registry-server" containerID="cri-o://1ee77a32afdd7748f0727dc09f2fb6f98510c2ca42ec747b1f8d8b9524f2e2b1" gracePeriod=2 Jan 26 00:10:47 crc kubenswrapper[4874]: I0126 00:10:47.248101 4874 generic.go:334] "Generic (PLEG): container finished" podID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerID="1ee77a32afdd7748f0727dc09f2fb6f98510c2ca42ec747b1f8d8b9524f2e2b1" exitCode=0 Jan 26 00:10:47 crc kubenswrapper[4874]: I0126 00:10:47.248202 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerDied","Data":"1ee77a32afdd7748f0727dc09f2fb6f98510c2ca42ec747b1f8d8b9524f2e2b1"} Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.692440 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.722518 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nbnp\" (UniqueName: \"kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp\") pod \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.722612 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content\") pod \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.728280 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp" (OuterVolumeSpecName: "kube-api-access-5nbnp") pod "ce11dbd9-2bd4-40a6-9735-2c3489511fc6" (UID: "ce11dbd9-2bd4-40a6-9735-2c3489511fc6"). InnerVolumeSpecName "kube-api-access-5nbnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.745490 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities\") pod \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\" (UID: \"ce11dbd9-2bd4-40a6-9735-2c3489511fc6\") " Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.746212 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities" (OuterVolumeSpecName: "utilities") pod "ce11dbd9-2bd4-40a6-9735-2c3489511fc6" (UID: "ce11dbd9-2bd4-40a6-9735-2c3489511fc6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.746546 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nbnp\" (UniqueName: \"kubernetes.io/projected/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-kube-api-access-5nbnp\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.746578 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.866116 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce11dbd9-2bd4-40a6-9735-2c3489511fc6" (UID: "ce11dbd9-2bd4-40a6-9735-2c3489511fc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:10:48 crc kubenswrapper[4874]: I0126 00:10:48.949385 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce11dbd9-2bd4-40a6-9735-2c3489511fc6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.260400 4874 generic.go:334] "Generic (PLEG): container finished" podID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerID="09264b002ad2506a6c12a204398f5ae5f95f6657e88d8e1839b41f1f0378b145" exitCode=0 Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.260480 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerDied","Data":"09264b002ad2506a6c12a204398f5ae5f95f6657e88d8e1839b41f1f0378b145"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.262792 4874 generic.go:334] "Generic (PLEG): container finished" podID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerID="1fc982f7b99f1749518ced00e81090470938a5623777f0bd068e8670797dbd56" exitCode=0 Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.262866 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerDied","Data":"1fc982f7b99f1749518ced00e81090470938a5623777f0bd068e8670797dbd56"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.266085 4874 generic.go:334] "Generic (PLEG): container finished" podID="f11f67a6-370d-4591-bebd-152ad8e10326" containerID="b503b86aa94507f76586e00396d696e2c0418beeba252d0af8eacad174cf898d" exitCode=0 Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.266186 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerDied","Data":"b503b86aa94507f76586e00396d696e2c0418beeba252d0af8eacad174cf898d"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.269189 4874 generic.go:334] "Generic (PLEG): container finished" podID="ff351744-6cfe-455c-851d-c8e218efee0e" containerID="b667c475e509ba98231fd329931e8726809210a69378495d72137dcb9528060e" exitCode=0 Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.269259 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" 
event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerDied","Data":"b667c475e509ba98231fd329931e8726809210a69378495d72137dcb9528060e"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.277494 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerStarted","Data":"8e3c81dea0ab72c723aa3cb0dc15a2079a5fb81a547f949caa9e4d7d0350a13d"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.282820 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fphsx" event={"ID":"ce11dbd9-2bd4-40a6-9735-2c3489511fc6","Type":"ContainerDied","Data":"48b66f5e52ae25457aa71ebe3a0a836d0ed85a066bd26ad1a5bb0c0e5ff8b1d7"} Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.282864 4874 scope.go:117] "RemoveContainer" containerID="1ee77a32afdd7748f0727dc09f2fb6f98510c2ca42ec747b1f8d8b9524f2e2b1" Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.283001 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fphsx" Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.327713 4874 scope.go:117] "RemoveContainer" containerID="a94fb49a36c98335b9565a257f7422a96d0bd6331629565e3b36470cd9be7d83" Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.396609 4874 scope.go:117] "RemoveContainer" containerID="188b34a62a51836a004f8d9874de5d170e685e208e876608910817864609da21" Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.408660 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.411241 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fphsx"] Jan 26 00:10:49 crc kubenswrapper[4874]: I0126 00:10:49.628069 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" path="/var/lib/kubelet/pods/ce11dbd9-2bd4-40a6-9735-2c3489511fc6/volumes" Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.346561 4874 generic.go:334] "Generic (PLEG): container finished" podID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerID="8e3c81dea0ab72c723aa3cb0dc15a2079a5fb81a547f949caa9e4d7d0350a13d" exitCode=0 Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.346644 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerDied","Data":"8e3c81dea0ab72c723aa3cb0dc15a2079a5fb81a547f949caa9e4d7d0350a13d"} Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.351261 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerStarted","Data":"e43e8b14ab04e71647f442ce65934eb39bce4ce4b23631584ac981674894af72"} Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.354525 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerStarted","Data":"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8"} Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.392760 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7zdk" podStartSLOduration=3.448735498 
podStartE2EDuration="1m8.392710954s" podCreationTimestamp="2026-01-26 00:09:42 +0000 UTC" firstStartedPulling="2026-01-26 00:09:44.218088214 +0000 UTC m=+158.952194751" lastFinishedPulling="2026-01-26 00:10:49.16206367 +0000 UTC m=+223.896170207" observedRunningTime="2026-01-26 00:10:50.38914692 +0000 UTC m=+225.123253467" watchObservedRunningTime="2026-01-26 00:10:50.392710954 +0000 UTC m=+225.126817511" Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.411361 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r4fgs" podStartSLOduration=3.286032449 podStartE2EDuration="1m10.411338833s" podCreationTimestamp="2026-01-26 00:09:40 +0000 UTC" firstStartedPulling="2026-01-26 00:09:42.146090175 +0000 UTC m=+156.880196712" lastFinishedPulling="2026-01-26 00:10:49.271396549 +0000 UTC m=+224.005503096" observedRunningTime="2026-01-26 00:10:50.410533525 +0000 UTC m=+225.144640072" watchObservedRunningTime="2026-01-26 00:10:50.411338833 +0000 UTC m=+225.145445370" Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.840326 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:10:50 crc kubenswrapper[4874]: I0126 00:10:50.840383 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:10:51 crc kubenswrapper[4874]: I0126 00:10:51.376481 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerStarted","Data":"62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b"} Jan 26 00:10:51 crc kubenswrapper[4874]: I0126 00:10:51.395512 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qx779" podStartSLOduration=2.378016907 podStartE2EDuration="1m11.395492368s" podCreationTimestamp="2026-01-26 00:09:40 +0000 UTC" firstStartedPulling="2026-01-26 00:09:42.140005625 +0000 UTC m=+156.874112162" lastFinishedPulling="2026-01-26 00:10:51.157481086 +0000 UTC m=+225.891587623" observedRunningTime="2026-01-26 00:10:51.393005671 +0000 UTC m=+226.127112228" watchObservedRunningTime="2026-01-26 00:10:51.395492368 +0000 UTC m=+226.129598905" Jan 26 00:10:51 crc kubenswrapper[4874]: I0126 00:10:51.876819 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r4fgs" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="registry-server" probeResult="failure" output=< Jan 26 00:10:51 crc kubenswrapper[4874]: timeout: failed to connect service ":50051" within 1s Jan 26 00:10:51 crc kubenswrapper[4874]: > Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.467150 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerStarted","Data":"d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca"} Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.484840 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llwxh" podStartSLOduration=3.299235238 podStartE2EDuration="1m12.484825618s" podCreationTimestamp="2026-01-26 00:09:40 +0000 UTC" firstStartedPulling="2026-01-26 00:09:42.143215375 +0000 UTC m=+156.877321912" lastFinishedPulling="2026-01-26 
00:10:51.328805745 +0000 UTC m=+226.062912292" observedRunningTime="2026-01-26 00:10:52.483826594 +0000 UTC m=+227.217933141" watchObservedRunningTime="2026-01-26 00:10:52.484825618 +0000 UTC m=+227.218932155" Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.489883 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.489927 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.489977 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.490436 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:10:52 crc kubenswrapper[4874]: I0126 00:10:52.490554 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b" gracePeriod=600 Jan 26 00:10:53 crc kubenswrapper[4874]: I0126 00:10:53.031559 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:10:53 crc kubenswrapper[4874]: I0126 00:10:53.031637 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:10:53 crc kubenswrapper[4874]: I0126 00:10:53.090288 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.478338 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerStarted","Data":"763afb54b3bfddcbb7c1525a40752f55111a0622eeb9d7d7b188f8313b70a4cf"} Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.480713 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b" exitCode=0 Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.480809 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b"} Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.483250 4874 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerStarted","Data":"1e8560cdbf43be5cf7a9c86479cab3ac6acaeb6f2cab6a337f3c89feeb33e591"} Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.485668 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerStarted","Data":"1bb5f4804d8eb0fc30a130467d8bdad5af0a68675807b7ec93cb68d904be4306"} Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.511923 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s2dns" podStartSLOduration=2.888300762 podStartE2EDuration="1m12.511897897s" podCreationTimestamp="2026-01-26 00:09:42 +0000 UTC" firstStartedPulling="2026-01-26 00:09:44.198234442 +0000 UTC m=+158.932340979" lastFinishedPulling="2026-01-26 00:10:53.821831577 +0000 UTC m=+228.555938114" observedRunningTime="2026-01-26 00:10:54.51027644 +0000 UTC m=+229.244382977" watchObservedRunningTime="2026-01-26 00:10:54.511897897 +0000 UTC m=+229.246004474" Jan 26 00:10:54 crc kubenswrapper[4874]: I0126 00:10:54.527236 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ps2v4" podStartSLOduration=5.158860647 podStartE2EDuration="1m14.527214621s" podCreationTimestamp="2026-01-26 00:09:40 +0000 UTC" firstStartedPulling="2026-01-26 00:09:42.140047417 +0000 UTC m=+156.874153964" lastFinishedPulling="2026-01-26 00:10:51.508401401 +0000 UTC m=+226.242507938" observedRunningTime="2026-01-26 00:10:54.524641211 +0000 UTC m=+229.258747778" watchObservedRunningTime="2026-01-26 00:10:54.527214621 +0000 UTC m=+229.261321168" Jan 26 00:10:55 crc kubenswrapper[4874]: I0126 00:10:55.956802 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" containerID="cri-o://8a60f1ef8836ef1b6b065e9b7b3a29c41522c2c6792c36df10d2d8138eff9947" gracePeriod=15 Jan 26 00:10:59 crc kubenswrapper[4874]: I0126 00:10:59.544381 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bdrl5" podStartSLOduration=8.819138977 podStartE2EDuration="1m16.544356058s" podCreationTimestamp="2026-01-26 00:09:43 +0000 UTC" firstStartedPulling="2026-01-26 00:09:46.239754683 +0000 UTC m=+160.973861220" lastFinishedPulling="2026-01-26 00:10:53.964971704 +0000 UTC m=+228.699078301" observedRunningTime="2026-01-26 00:10:59.543413935 +0000 UTC m=+234.277520472" watchObservedRunningTime="2026-01-26 00:10:59.544356058 +0000 UTC m=+234.278462635" Jan 26 00:11:00 crc kubenswrapper[4874]: I0126 00:11:00.635198 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:11:00 crc kubenswrapper[4874]: I0126 00:11:00.635266 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:11:00 crc kubenswrapper[4874]: I0126 00:11:00.680725 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:11:00 crc kubenswrapper[4874]: I0126 00:11:00.883010 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:11:00 crc kubenswrapper[4874]: I0126 00:11:00.919249 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.051837 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.051906 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.120738 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.283229 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.283865 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:11:01 crc kubenswrapper[4874]: I0126 00:11:01.332275 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:11:02 crc kubenswrapper[4874]: I0126 00:11:02.637328 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:11:02 crc kubenswrapper[4874]: I0126 00:11:02.637854 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:11:02 crc kubenswrapper[4874]: I0126 00:11:02.705365 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:11:02 crc kubenswrapper[4874]: I0126 00:11:02.754105 4874 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8k592 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Jan 26 00:11:02 crc kubenswrapper[4874]: I0126 00:11:02.754185 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Jan 26 00:11:03 crc kubenswrapper[4874]: I0126 00:11:03.088064 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:11:03 crc kubenswrapper[4874]: I0126 00:11:03.870766 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:11:03 crc kubenswrapper[4874]: I0126 00:11:03.871215 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:11:03 crc kubenswrapper[4874]: I0126 00:11:03.939942 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.606527 4874 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.607055 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="extract-content" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607135 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="extract-content" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.607207 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb2d946-2cbf-4d14-9e52-385aa8235506" containerName="pruner" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607273 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb2d946-2cbf-4d14-9e52-385aa8235506" containerName="pruner" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.607344 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="extract-utilities" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607416 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="extract-utilities" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.607489 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="registry-server" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607579 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="registry-server" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607758 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce11dbd9-2bd4-40a6-9735-2c3489511fc6" containerName="registry-server" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.607831 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb2d946-2cbf-4d14-9e52-385aa8235506" containerName="pruner" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608252 4874 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608405 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608632 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1" gracePeriod=15 Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608647 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811" gracePeriod=15 Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608886 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050" gracePeriod=15 Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608702 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766" gracePeriod=15 Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.608786 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da" gracePeriod=15 Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.631638 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.631696 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.631769 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.631800 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.631930 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.646386 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.677522 4874 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.677877 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.677904 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.677927 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.677940 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.677978 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.677992 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.678018 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678030 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.678046 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678059 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.678074 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678087 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678261 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 
26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678281 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678302 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678318 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678337 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678350 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: E0126 00:11:04.678509 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.678532 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.748912 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749246 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749393 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749476 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749124 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749510 4874 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749643 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749680 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749722 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749762 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749849 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.749337 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.750177 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.850510 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.851046 4874 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.851195 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.851272 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.850666 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.851091 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:04 crc kubenswrapper[4874]: I0126 00:11:04.943420 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:11:05 crc kubenswrapper[4874]: W0126 00:11:05.200841 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-386948f0791c7730a8dc72f5e14cfed04f7200ae619a8b12a988b849a8aadd52 WatchSource:0}: Error finding container 386948f0791c7730a8dc72f5e14cfed04f7200ae619a8b12a988b849a8aadd52: Status 404 returned error can't find the container with id 386948f0791c7730a8dc72f5e14cfed04f7200ae619a8b12a988b849a8aadd52 Jan 26 00:11:05 crc kubenswrapper[4874]: E0126 00:11:05.205212 4874 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e1f6efff2fd1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-26 00:11:05.204161823 +0000 UTC m=+239.938268360,LastTimestamp:2026-01-26 00:11:05.204161823 +0000 UTC m=+239.938268360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.355528 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-558db77b4-8k592_cabfc520-9523-45b8-8be1-57e71788c5a7/oauth-openshift/0.log" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.355626 4874 generic.go:334] "Generic (PLEG): container finished" podID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerID="8a60f1ef8836ef1b6b065e9b7b3a29c41522c2c6792c36df10d2d8138eff9947" exitCode=-1 Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.355717 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" event={"ID":"cabfc520-9523-45b8-8be1-57e71788c5a7","Type":"ContainerDied","Data":"8a60f1ef8836ef1b6b065e9b7b3a29c41522c2c6792c36df10d2d8138eff9947"} Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.401482 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.402638 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.403276 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.403732 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.626297 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.626813 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:05 crc kubenswrapper[4874]: I0126 00:11:05.627121 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.366020 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"386948f0791c7730a8dc72f5e14cfed04f7200ae619a8b12a988b849a8aadd52"} Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.378818 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.381020 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.381902 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766" exitCode=2 Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.436789 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.437671 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.438223 4874 status_manager.go:851] "Failed to get status for pod" 
podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.438457 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.441641 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.442111 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.442457 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.442730 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.443463 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.447793 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.448421 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.448649 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.448902 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.449241 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.449505 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.449757 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.450201 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.450501 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.450764 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.450992 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.451190 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:06 crc kubenswrapper[4874]: I0126 00:11:06.451393 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" 
pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.694179 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.695580 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.695972 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.696215 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.696447 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.696671 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.696878 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.697138 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840175 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle\") pod 
\"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840732 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840775 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9sj\" (UniqueName: \"kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840805 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840874 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.840904 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841063 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841092 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841177 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841202 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: 
I0126 00:11:09.841232 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841254 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841278 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.841325 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login\") pod \"cabfc520-9523-45b8-8be1-57e71788c5a7\" (UID: \"cabfc520-9523-45b8-8be1-57e71788c5a7\") " Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.842071 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.842096 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.842190 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.842409 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.842452 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.849622 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj" (OuterVolumeSpecName: "kube-api-access-5j9sj") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "kube-api-access-5j9sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.852467 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.853014 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.853028 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.853465 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.853899 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.854265 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.854539 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.854634 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cabfc520-9523-45b8-8be1-57e71788c5a7" (UID: "cabfc520-9523-45b8-8be1-57e71788c5a7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.921546 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.923172 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.924386 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811" exitCode=0 Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.924590 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da" exitCode=0 Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.924731 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050" exitCode=0 Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.924897 4874 scope.go:117] "RemoveContainer" containerID="7a8ccd415e62c663cf5fe2c668808e58cd52c9530b95941fb8cb010f84dcbf7f" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942806 4874 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942854 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942871 4874 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942890 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942901 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942911 4874 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabfc520-9523-45b8-8be1-57e71788c5a7-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942921 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942936 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942945 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942971 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942981 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.942991 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9sj\" (UniqueName: \"kubernetes.io/projected/cabfc520-9523-45b8-8be1-57e71788c5a7-kube-api-access-5j9sj\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.943000 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:09 crc kubenswrapper[4874]: I0126 00:11:09.943011 4874 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cabfc520-9523-45b8-8be1-57e71788c5a7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 
00:11:10.286204 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.287746 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.288505 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.289704 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.290354 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.291366 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.292065 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.292416 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.292995 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.295046 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 
38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448224 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448311 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448345 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448756 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448789 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.448929 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.550187 4874 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.550222 4874 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.550232 4874 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.911501 4874 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.912146 4874 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.912402 4874 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.912613 4874 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.912830 4874 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.912856 4874 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 26 00:11:10 crc kubenswrapper[4874]: E0126 00:11:10.913092 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="200ms" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.933741 4874 generic.go:334] "Generic (PLEG): container finished" podID="cbefa690-0d0b-4c8e-8eda-ea618b034982" containerID="abf9585f8dec20b27d15be216deba030eff85ec641c89e9198731a891fe76156" exitCode=0 Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.933873 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbefa690-0d0b-4c8e-8eda-ea618b034982","Type":"ContainerDied","Data":"abf9585f8dec20b27d15be216deba030eff85ec641c89e9198731a891fe76156"} Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.934789 4874 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.935775 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.936429 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.936790 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.937234 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.937558 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.938005 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.938664 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.939040 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.939237 4874 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.940624 4874 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1" exitCode=0 Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.940724 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.940806 4874 scope.go:117] "RemoveContainer" containerID="fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.944252 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" event={"ID":"cabfc520-9523-45b8-8be1-57e71788c5a7","Type":"ContainerDied","Data":"fb02bc9343a8b9df98138be076e434e5ad00b7af5cb06e63183894584a251b7a"} Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.944341 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.945184 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.945523 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.945879 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.946334 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.946829 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.947170 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.947289 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0"} Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.947601 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.947938 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.948369 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.948715 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.949028 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.950191 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.950945 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.951493 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.952056 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53"} Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.952073 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.952530 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.952764 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.953048 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.953352 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.953686 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.954045 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.956749 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" 
pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.957062 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.957302 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.957514 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.957724 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.957930 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.958186 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.958418 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.968696 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.969126 4874 status_manager.go:851] "Failed to get status for pod" 
podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.969551 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.969702 4874 scope.go:117] "RemoveContainer" containerID="2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.970081 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.970422 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.970767 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.971085 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.971480 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.971880 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.972270 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.972697 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.973125 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.973502 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.973862 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.974254 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.974672 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.975183 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.975578 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.975910 4874 status_manager.go:851] "Failed to get status for pod" 
podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.976326 4874 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:10 crc kubenswrapper[4874]: I0126 00:11:10.987803 4874 scope.go:117] "RemoveContainer" containerID="3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.007553 4874 scope.go:117] "RemoveContainer" containerID="45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.023823 4874 scope.go:117] "RemoveContainer" containerID="67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.040864 4874 scope.go:117] "RemoveContainer" containerID="a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.070338 4874 scope.go:117] "RemoveContainer" containerID="fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.070945 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\": container with ID starting with fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811 not found: ID does not exist" containerID="fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.070995 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811"} err="failed to get container status \"fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\": rpc error: code = NotFound desc = could not find container \"fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811\": container with ID starting with fd1c43f40cd5ceceb4429acd45816f008573679124f11d9beb38ceb198d23811 not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.071029 4874 scope.go:117] "RemoveContainer" containerID="2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.071528 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\": container with ID starting with 2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da not found: ID does not exist" containerID="2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.071637 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da"} err="failed to get container status 
\"2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\": rpc error: code = NotFound desc = could not find container \"2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da\": container with ID starting with 2eccdb86146cb314f7207755e8d7bc621a864caa2986ec9d0b614e53f32537da not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.071667 4874 scope.go:117] "RemoveContainer" containerID="3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.072338 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\": container with ID starting with 3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050 not found: ID does not exist" containerID="3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.072387 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050"} err="failed to get container status \"3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\": rpc error: code = NotFound desc = could not find container \"3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050\": container with ID starting with 3c4028ad1d56235c8578074323dabfe0697a79d27d22d6ed4c951fccb0f1f050 not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.072420 4874 scope.go:117] "RemoveContainer" containerID="45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.072725 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\": container with ID starting with 45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766 not found: ID does not exist" containerID="45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.072749 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766"} err="failed to get container status \"45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\": rpc error: code = NotFound desc = could not find container \"45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766\": container with ID starting with 45f3e1f59938dd09f78e85d484938b0d2e54539ceabf7bace3585de548bf0766 not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.072764 4874 scope.go:117] "RemoveContainer" containerID="67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.073697 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\": container with ID starting with 67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1 not found: ID does not exist" containerID="67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.073750 4874 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1"} err="failed to get container status \"67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\": rpc error: code = NotFound desc = could not find container \"67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1\": container with ID starting with 67feae257976cbff660d72e616218b58db82f62b2be01fb51247a8085b16a6e1 not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.073768 4874 scope.go:117] "RemoveContainer" containerID="a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.074130 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\": container with ID starting with a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9 not found: ID does not exist" containerID="a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.074171 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9"} err="failed to get container status \"a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\": rpc error: code = NotFound desc = could not find container \"a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9\": container with ID starting with a97a3875dc833c7babc7213ed51a0a39343ca595fa2efedba7cc723709c2c7a9 not found: ID does not exist" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.074198 4874 scope.go:117] "RemoveContainer" containerID="8a60f1ef8836ef1b6b065e9b7b3a29c41522c2c6792c36df10d2d8138eff9947" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.113868 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="400ms" Jan 26 00:11:11 crc kubenswrapper[4874]: E0126 00:11:11.515444 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="800ms" Jan 26 00:11:11 crc kubenswrapper[4874]: I0126 00:11:11.629051 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 26 00:11:12 crc kubenswrapper[4874]: E0126 00:11:12.063992 4874 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e1f6efff2fd1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-26 00:11:05.204161823 +0000 UTC m=+239.938268360,LastTimestamp:2026-01-26 00:11:05.204161823 +0000 UTC m=+239.938268360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.289404 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.290556 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.290813 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.291065 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.291321 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.291606 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.291825 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.292203 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 
00:11:12.292525 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.292796 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:12 crc kubenswrapper[4874]: E0126 00:11:12.316978 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="1.6s" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.383665 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock\") pod \"cbefa690-0d0b-4c8e-8eda-ea618b034982\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.383736 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access\") pod \"cbefa690-0d0b-4c8e-8eda-ea618b034982\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.383779 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir\") pod \"cbefa690-0d0b-4c8e-8eda-ea618b034982\" (UID: \"cbefa690-0d0b-4c8e-8eda-ea618b034982\") " Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.383780 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock" (OuterVolumeSpecName: "var-lock") pod "cbefa690-0d0b-4c8e-8eda-ea618b034982" (UID: "cbefa690-0d0b-4c8e-8eda-ea618b034982"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.384093 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cbefa690-0d0b-4c8e-8eda-ea618b034982" (UID: "cbefa690-0d0b-4c8e-8eda-ea618b034982"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.384487 4874 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.384511 4874 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbefa690-0d0b-4c8e-8eda-ea618b034982-var-lock\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.393309 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cbefa690-0d0b-4c8e-8eda-ea618b034982" (UID: "cbefa690-0d0b-4c8e-8eda-ea618b034982"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.485216 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbefa690-0d0b-4c8e-8eda-ea618b034982-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.974470 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbefa690-0d0b-4c8e-8eda-ea618b034982","Type":"ContainerDied","Data":"d9a86f684f9a70b9f92e155ca4b39422325aff50e91a7f48fc7d254cd7e398f3"} Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.974551 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a86f684f9a70b9f92e155ca4b39422325aff50e91a7f48fc7d254cd7e398f3" Jan 26 00:11:12 crc kubenswrapper[4874]: I0126 00:11:12.974592 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.004194 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.004905 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.005657 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.006155 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.006931 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.009128 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.009517 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.009892 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: I0126 00:11:13.010201 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:13 crc kubenswrapper[4874]: E0126 00:11:13.918192 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="3.2s" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.626333 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.627716 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.628423 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.628907 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.629419 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.629816 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.630535 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.631335 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:15 crc kubenswrapper[4874]: I0126 00:11:15.631841 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:17 crc kubenswrapper[4874]: I0126 00:11:17.105029 4874 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 26 00:11:17 crc kubenswrapper[4874]: I0126 00:11:17.105115 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 26 00:11:17 crc kubenswrapper[4874]: E0126 00:11:17.119832 4874 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="6.4s" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.011881 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.012339 4874 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a" exitCode=1 Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.012391 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a"} Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.013105 4874 scope.go:117] "RemoveContainer" containerID="969e8519f3ad566f2a0645cbf0e54d4a3eb21ae1703f13af08c582f9856e372a" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.013581 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.014061 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc 
kubenswrapper[4874]: I0126 00:11:18.014342 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.014789 4874 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.015452 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.015698 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.016017 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.016262 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.021141 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.021468 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:18 crc kubenswrapper[4874]: I0126 00:11:18.827858 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.030701 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.030831 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"804fcce02bcc8f642cfa360bb9d88a9874818aa011d7ae8fc0c83de1405de75e"} Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.032145 4874 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.032745 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.033592 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.034388 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.034876 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.035538 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.036318 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.036928 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.037422 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.038055 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.620380 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.621588 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.622286 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.622658 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.623171 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.623703 4874 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.624136 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.624500 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.624962 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.625372 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.625672 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.646490 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.646528 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:19 crc kubenswrapper[4874]: E0126 00:11:19.647188 4874 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:19 crc kubenswrapper[4874]: I0126 00:11:19.648020 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:19 crc kubenswrapper[4874]: W0126 00:11:19.680092 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-cddf6aa37d7f615211bd3454fa38d1bc12b221ffffa75afc5b5dfbcb0846fa8d WatchSource:0}: Error finding container cddf6aa37d7f615211bd3454fa38d1bc12b221ffffa75afc5b5dfbcb0846fa8d: Status 404 returned error can't find the container with id cddf6aa37d7f615211bd3454fa38d1bc12b221ffffa75afc5b5dfbcb0846fa8d Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.043244 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8fb7349dccccc7d47b3bf19267119efb234d3b04550647f756f7cd9105657908"} Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.043777 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cddf6aa37d7f615211bd3454fa38d1bc12b221ffffa75afc5b5dfbcb0846fa8d"} Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.044584 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.044625 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:20 crc kubenswrapper[4874]: E0126 00:11:20.045463 4874 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.046418 4874 status_manager.go:851] "Failed to get status for pod" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" pod="openshift-marketplace/certified-operators-llwxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-llwxh\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.047414 4874 status_manager.go:851] "Failed to get status for pod" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.048460 4874 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.049139 4874 status_manager.go:851] "Failed to get status for pod" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" pod="openshift-marketplace/redhat-operators-bdrl5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bdrl5\": dial tcp 
38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.049778 4874 status_manager.go:851] "Failed to get status for pod" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" pod="openshift-marketplace/community-operators-qx779" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-qx779\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.050196 4874 status_manager.go:851] "Failed to get status for pod" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-ffp42\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.050523 4874 status_manager.go:851] "Failed to get status for pod" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" pod="openshift-authentication/oauth-openshift-558db77b4-8k592" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8k592\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.051181 4874 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.051683 4874 status_manager.go:851] "Failed to get status for pod" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" pod="openshift-marketplace/redhat-marketplace-s2dns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-s2dns\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:20 crc kubenswrapper[4874]: I0126 00:11:20.052223 4874 status_manager.go:851] "Failed to get status for pod" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" pod="openshift-marketplace/certified-operators-ps2v4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ps2v4\": dial tcp 38.102.83.198:6443: connect: connection refused" Jan 26 00:11:21 crc kubenswrapper[4874]: I0126 00:11:21.070730 4874 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8fb7349dccccc7d47b3bf19267119efb234d3b04550647f756f7cd9105657908" exitCode=0 Jan 26 00:11:21 crc kubenswrapper[4874]: I0126 00:11:21.070780 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8fb7349dccccc7d47b3bf19267119efb234d3b04550647f756f7cd9105657908"} Jan 26 00:11:21 crc kubenswrapper[4874]: I0126 00:11:21.070812 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4fdb412afdb32e33c3e539d3124202ab089693133655e0b5e0d6d1b03d86dd9"} Jan 26 00:11:21 crc kubenswrapper[4874]: I0126 00:11:21.070821 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"187eb81f49f3f11cf5791e1e17f0e0e059400b9172a4c6c1c669a17886e49b61"} Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.094752 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e32daeff70c3fd1588e071ac71359d7f4a3e4f4058ff8753a8ed925f5389ec9"} Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.095092 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8fa9380c7d4529255afdc1eeab73c71d07cb6157e98867cfe764f89759a9b32e"} Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.095107 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f016f29850b84d3d6419b292290f096cfa839c0cc6feca3dd6188d4cf7e71ca"} Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.095400 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.095417 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:22 crc kubenswrapper[4874]: I0126 00:11:22.095594 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:23 crc kubenswrapper[4874]: I0126 00:11:23.799161 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:11:23 crc kubenswrapper[4874]: I0126 00:11:23.805608 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:11:24 crc kubenswrapper[4874]: I0126 00:11:24.107087 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:11:24 crc kubenswrapper[4874]: I0126 00:11:24.649185 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:24 crc kubenswrapper[4874]: I0126 00:11:24.649250 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:24 crc kubenswrapper[4874]: I0126 00:11:24.657399 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:27 crc kubenswrapper[4874]: I0126 00:11:27.102485 4874 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:27 crc kubenswrapper[4874]: I0126 00:11:27.124839 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:27 crc kubenswrapper[4874]: I0126 00:11:27.124877 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:27 crc kubenswrapper[4874]: I0126 00:11:27.128837 4874 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:27 crc kubenswrapper[4874]: I0126 00:11:27.377091 4874 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4b62b6aa-44c2-4545-a581-53e5722828ff" Jan 26 00:11:28 crc kubenswrapper[4874]: I0126 00:11:28.130211 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:28 crc kubenswrapper[4874]: I0126 00:11:28.130256 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:28 crc kubenswrapper[4874]: I0126 00:11:28.133502 4874 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4b62b6aa-44c2-4545-a581-53e5722828ff" Jan 26 00:11:28 crc kubenswrapper[4874]: I0126 00:11:28.838287 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 26 00:11:36 crc kubenswrapper[4874]: I0126 00:11:36.599243 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 26 00:11:37 crc kubenswrapper[4874]: I0126 00:11:37.563089 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 26 00:11:37 crc kubenswrapper[4874]: I0126 00:11:37.781053 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 26 00:11:38 crc kubenswrapper[4874]: I0126 00:11:38.051774 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 26 00:11:38 crc kubenswrapper[4874]: I0126 00:11:38.389347 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 26 00:11:38 crc kubenswrapper[4874]: I0126 00:11:38.675426 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 26 00:11:38 crc kubenswrapper[4874]: I0126 00:11:38.693676 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 26 00:11:38 crc kubenswrapper[4874]: I0126 00:11:38.757303 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.155569 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.227164 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.697241 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.739731 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" 
Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.886113 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.929723 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 26 00:11:39 crc kubenswrapper[4874]: I0126 00:11:39.931407 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.304236 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.357699 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.384027 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.473845 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.510295 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.513892 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.591174 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.660273 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.811072 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.847888 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 26 00:11:40 crc kubenswrapper[4874]: I0126 00:11:40.895518 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.102347 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.123012 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.184559 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.226165 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.322847 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 26 00:11:41 
crc kubenswrapper[4874]: I0126 00:11:41.339080 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.414339 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.503406 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.591267 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.591356 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.603833 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.698256 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.725692 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.738309 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.750049 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.846365 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.854374 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.863268 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.914194 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 26 00:11:41 crc kubenswrapper[4874]: I0126 00:11:41.988215 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.050140 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.126334 4874 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.131671 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.132581 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 
00:11:42.181029 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.192431 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.247870 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.264793 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.306035 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.350668 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.462248 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.469258 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.490892 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.614592 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.673229 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.703614 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.717392 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.732763 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.847880 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.851332 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 26 00:11:42 crc kubenswrapper[4874]: I0126 00:11:42.984245 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.107520 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.199429 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 
26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.232546 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.273487 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.314745 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.365006 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.400512 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.497858 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.508243 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.669282 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.841149 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.895847 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 26 00:11:43 crc kubenswrapper[4874]: I0126 00:11:43.994766 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.028407 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.236335 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.273482 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.291404 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.454365 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.521007 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.560935 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.614483 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 26 00:11:44 crc 
kubenswrapper[4874]: I0126 00:11:44.627698 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.635337 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.697491 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.743179 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.792166 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.825023 4874 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.921869 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 26 00:11:44 crc kubenswrapper[4874]: I0126 00:11:44.924480 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.097003 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.134745 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.243913 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.246876 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.282030 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.353541 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.408432 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.419702 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.460573 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.488104 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.502895 4874 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.504337 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.791355 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.814694 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.900857 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.932331 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.941313 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.946294 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 26 00:11:45 crc kubenswrapper[4874]: I0126 00:11:45.960150 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.006862 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.029380 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.080926 4874 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.130289 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.177763 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.367745 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.428921 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.441312 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.478904 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.522621 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.578750 4874 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.589721 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.606213 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.607769 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.665049 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.758700 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 26 00:11:46 crc kubenswrapper[4874]: I0126 00:11:46.771599 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.069211 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.077380 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.079669 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.117258 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.130821 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.136826 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.163521 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.168587 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.178268 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.262747 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.276912 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.308072 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.403271 4874 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.425711 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.471873 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.571555 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.626745 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.691194 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.747210 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.777578 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.859315 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.920634 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.948371 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 26 00:11:47 crc kubenswrapper[4874]: I0126 00:11:47.962236 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.000263 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.014202 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.025938 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.139604 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.274214 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.290498 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.372997 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.394255 4874 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.419530 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.465703 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.498648 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.532747 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.551521 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.552034 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.555380 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.583230 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.656261 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 26 00:11:48 crc kubenswrapper[4874]: I0126 00:11:48.862439 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.135158 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.212405 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.261058 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.284286 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.333299 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.395350 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.469630 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.489985 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.497696 4874 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.568341 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.571464 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.571636 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.571743 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.616168 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.662684 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.663298 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.677471 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.713346 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.739893 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.769304 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.816492 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.842898 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.860816 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.875258 4874 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.880423 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.880397061 podStartE2EDuration="45.880397061s" podCreationTimestamp="2026-01-26 00:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:11:27.172150931 +0000 UTC m=+261.906257468" watchObservedRunningTime="2026-01-26 00:11:49.880397061 +0000 UTC m=+284.614503638" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.881083 4874 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.884920 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-8k592"] Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885035 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-v7sv8","openshift-kube-apiserver/kube-apiserver-crc"] Jan 26 00:11:49 crc kubenswrapper[4874]: E0126 00:11:49.885352 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" containerName="installer" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885384 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" containerName="installer" Jan 26 00:11:49 crc kubenswrapper[4874]: E0126 00:11:49.885409 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885423 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885535 4874 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885614 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="262b9da0-a082-4538-af3b-2c747e51e156" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885615 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" containerName="oauth-openshift" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.885779 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbefa690-0d0b-4c8e-8eda-ea618b034982" containerName="installer" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.886639 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.890735 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.891169 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.892255 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896171 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896261 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896178 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896366 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896617 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896803 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.896867 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.897202 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.897415 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.898494 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.907275 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.909307 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.935768 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.935738962 podStartE2EDuration="22.935738962s" podCreationTimestamp="2026-01-26 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:11:49.928026702 +0000 UTC m=+284.662133259" watchObservedRunningTime="2026-01-26 00:11:49.935738962 +0000 UTC m=+284.669845499" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 
00:11:49.937301 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.946838 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974696 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974776 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974812 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974838 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-policies\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974861 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.974895 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975017 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 
crc kubenswrapper[4874]: I0126 00:11:49.975087 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj22g\" (UniqueName: \"kubernetes.io/projected/5803105d-d9f1-4605-bb41-fe1fed078e7d-kube-api-access-bj22g\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975128 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975175 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-dir\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975209 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975339 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975411 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.975478 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:49 crc kubenswrapper[4874]: I0126 00:11:49.979679 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.066619 4874 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.075336 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076622 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076707 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076741 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj22g\" (UniqueName: \"kubernetes.io/projected/5803105d-d9f1-4605-bb41-fe1fed078e7d-kube-api-access-bj22g\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076779 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076846 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-dir\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076877 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076915 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.076978 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077008 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077041 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077078 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077100 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077123 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-policies\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.077142 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.078104 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.078537 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-dir\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.078632 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.079260 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-audit-policies\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.079512 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.084779 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.085006 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.085057 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.086480 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.087018 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.087531 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.088826 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.090110 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5803105d-d9f1-4605-bb41-fe1fed078e7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.096205 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj22g\" (UniqueName: \"kubernetes.io/projected/5803105d-d9f1-4605-bb41-fe1fed078e7d-kube-api-access-bj22g\") pod \"oauth-openshift-66c58d94fc-v7sv8\" (UID: \"5803105d-d9f1-4605-bb41-fe1fed078e7d\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.206087 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.255097 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.401536 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-v7sv8"] Jan 26 00:11:50 crc kubenswrapper[4874]: W0126 00:11:50.411258 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5803105d_d9f1_4605_bb41_fe1fed078e7d.slice/crio-12229aa13f9282b0fc72c1fef6e5bd48fdd0377fe2cf286fea01fb0cc2cd89c6 WatchSource:0}: Error finding container 12229aa13f9282b0fc72c1fef6e5bd48fdd0377fe2cf286fea01fb0cc2cd89c6: Status 404 returned error can't find the container with id 12229aa13f9282b0fc72c1fef6e5bd48fdd0377fe2cf286fea01fb0cc2cd89c6 Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.450711 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.452355 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.463120 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.513866 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.600486 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.617503 4874 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.689693 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.860605 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.900798 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.935754 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 26 00:11:50 crc kubenswrapper[4874]: I0126 00:11:50.971930 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.015592 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.031035 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.153320 4874 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.278513 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/0.log" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.278562 4874 generic.go:334] "Generic (PLEG): container finished" podID="5803105d-d9f1-4605-bb41-fe1fed078e7d" containerID="283484de920fd7d27a075e4ec2d61ae9cbc77f6b575a5e843f41f89002c11281" exitCode=255 Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.278674 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" event={"ID":"5803105d-d9f1-4605-bb41-fe1fed078e7d","Type":"ContainerDied","Data":"283484de920fd7d27a075e4ec2d61ae9cbc77f6b575a5e843f41f89002c11281"} Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.278734 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" event={"ID":"5803105d-d9f1-4605-bb41-fe1fed078e7d","Type":"ContainerStarted","Data":"12229aa13f9282b0fc72c1fef6e5bd48fdd0377fe2cf286fea01fb0cc2cd89c6"} Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.279480 4874 scope.go:117] "RemoveContainer" containerID="283484de920fd7d27a075e4ec2d61ae9cbc77f6b575a5e843f41f89002c11281" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.302139 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.335121 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.538664 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.628602 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabfc520-9523-45b8-8be1-57e71788c5a7" path="/var/lib/kubelet/pods/cabfc520-9523-45b8-8be1-57e71788c5a7/volumes" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.918135 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.945074 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 26 00:11:51 crc kubenswrapper[4874]: I0126 00:11:51.971393 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.080243 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.170213 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.229723 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.285776 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/0.log" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.285833 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" event={"ID":"5803105d-d9f1-4605-bb41-fe1fed078e7d","Type":"ContainerStarted","Data":"d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939"} Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.286183 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.287137 4874 patch_prober.go:28] interesting pod/oauth-openshift-66c58d94fc-v7sv8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body= Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.287184 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" podUID="5803105d-d9f1-4605-bb41-fe1fed078e7d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.305262 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" podStartSLOduration=82.305243403 podStartE2EDuration="1m22.305243403s" podCreationTimestamp="2026-01-26 00:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:11:52.302131636 +0000 UTC m=+287.036238173" watchObservedRunningTime="2026-01-26 00:11:52.305243403 +0000 UTC m=+287.039349940" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.591513 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.596019 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.907733 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.922520 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.939068 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.941769 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 26 00:11:52 crc kubenswrapper[4874]: I0126 00:11:52.956534 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.292619 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/1.log" Jan 26 00:11:53 crc 
kubenswrapper[4874]: I0126 00:11:53.293521 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/0.log" Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.293576 4874 generic.go:334] "Generic (PLEG): container finished" podID="5803105d-d9f1-4605-bb41-fe1fed078e7d" containerID="d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939" exitCode=255 Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.293610 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" event={"ID":"5803105d-d9f1-4605-bb41-fe1fed078e7d","Type":"ContainerDied","Data":"d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939"} Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.293655 4874 scope.go:117] "RemoveContainer" containerID="283484de920fd7d27a075e4ec2d61ae9cbc77f6b575a5e843f41f89002c11281" Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.294163 4874 scope.go:117] "RemoveContainer" containerID="d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939" Jan 26 00:11:53 crc kubenswrapper[4874]: E0126 00:11:53.294452 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66c58d94fc-v7sv8_openshift-authentication(5803105d-d9f1-4605-bb41-fe1fed078e7d)\"" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" podUID="5803105d-d9f1-4605-bb41-fe1fed078e7d" Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.649361 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 26 00:11:53 crc kubenswrapper[4874]: I0126 00:11:53.678228 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 26 00:11:54 crc kubenswrapper[4874]: I0126 00:11:54.302804 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/1.log" Jan 26 00:11:54 crc kubenswrapper[4874]: I0126 00:11:54.303648 4874 scope.go:117] "RemoveContainer" containerID="d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939" Jan 26 00:11:54 crc kubenswrapper[4874]: E0126 00:11:54.304111 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66c58d94fc-v7sv8_openshift-authentication(5803105d-d9f1-4605-bb41-fe1fed078e7d)\"" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" podUID="5803105d-d9f1-4605-bb41-fe1fed078e7d" Jan 26 00:11:59 crc kubenswrapper[4874]: I0126 00:11:59.906654 4874 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 26 00:11:59 crc kubenswrapper[4874]: I0126 00:11:59.907507 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53" gracePeriod=5 Jan 26 00:12:00 crc kubenswrapper[4874]: I0126 00:12:00.206310 4874 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:12:00 crc kubenswrapper[4874]: I0126 00:12:00.207860 4874 scope.go:117] "RemoveContainer" containerID="d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939" Jan 26 00:12:00 crc kubenswrapper[4874]: E0126 00:12:00.208495 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-66c58d94fc-v7sv8_openshift-authentication(5803105d-d9f1-4605-bb41-fe1fed078e7d)\"" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" podUID="5803105d-d9f1-4605-bb41-fe1fed078e7d" Jan 26 00:12:05 crc kubenswrapper[4874]: I0126 00:12:05.426362 4874 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.303694 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.304151 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.380424 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.380582 4874 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53" exitCode=137 Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.380698 4874 scope.go:117] "RemoveContainer" containerID="f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.380716 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.403439 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.403577 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.403656 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.403785 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.403849 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.408153 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.408244 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.408254 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.408279 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.416553 4874 scope.go:117] "RemoveContainer" containerID="f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.417699 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:12:06 crc kubenswrapper[4874]: E0126 00:12:06.418745 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53\": container with ID starting with f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53 not found: ID does not exist" containerID="f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.418798 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53"} err="failed to get container status \"f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53\": rpc error: code = NotFound desc = could not find container \"f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53\": container with ID starting with f9fbc47e4a15c79d05adb523d2a0b7d9027c2fd3775dfa495826564ef2df0c53 not found: ID does not exist" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.508177 4874 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.508222 4874 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.508235 4874 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.508245 4874 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:06 crc kubenswrapper[4874]: I0126 00:12:06.508258 4874 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.390545 4874 generic.go:334] "Generic (PLEG): container finished" podID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerID="334b5aea9863f53fcc5738bcac353ee91b6959e47bdc154b420c1f8104992b4e" exitCode=0 Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.390663 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" 
event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerDied","Data":"334b5aea9863f53fcc5738bcac353ee91b6959e47bdc154b420c1f8104992b4e"} Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.391428 4874 scope.go:117] "RemoveContainer" containerID="334b5aea9863f53fcc5738bcac353ee91b6959e47bdc154b420c1f8104992b4e" Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.629221 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.630204 4874 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.648011 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.648069 4874 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="56c8b234-caed-4e1f-83a2-b072b5fcea4c" Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.653992 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 26 00:12:07 crc kubenswrapper[4874]: I0126 00:12:07.654061 4874 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="56c8b234-caed-4e1f-83a2-b072b5fcea4c" Jan 26 00:12:08 crc kubenswrapper[4874]: I0126 00:12:08.401094 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerStarted","Data":"2fca9c432c38d8d0a50bbff06297e1234e088668ed43513c7b40288310fb8a3e"} Jan 26 00:12:08 crc kubenswrapper[4874]: I0126 00:12:08.401868 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:12:08 crc kubenswrapper[4874]: I0126 00:12:08.406420 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:12:11 crc kubenswrapper[4874]: I0126 00:12:11.621099 4874 scope.go:117] "RemoveContainer" containerID="d76ed386b934ee84ad445cf3add21b799729afc51350d2882502353bd47de939" Jan 26 00:12:15 crc kubenswrapper[4874]: I0126 00:12:15.466742 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-66c58d94fc-v7sv8_5803105d-d9f1-4605-bb41-fe1fed078e7d/oauth-openshift/1.log" Jan 26 00:12:15 crc kubenswrapper[4874]: I0126 00:12:15.467442 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" event={"ID":"5803105d-d9f1-4605-bb41-fe1fed078e7d","Type":"ContainerStarted","Data":"41a9e4b2668aa74b98ffae202f0bc2e24ee065877d8fb7b9b6a5180714e7ccda"} Jan 26 00:12:15 crc kubenswrapper[4874]: I0126 00:12:15.467864 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:12:15 crc kubenswrapper[4874]: I0126 00:12:15.476107 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66c58d94fc-v7sv8" Jan 26 00:12:20 crc 
kubenswrapper[4874]: I0126 00:12:20.861548 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.176653 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.177488 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m7zdk" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="registry-server" containerID="cri-o://b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8" gracePeriod=2 Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.643897 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.657374 4874 generic.go:334] "Generic (PLEG): container finished" podID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerID="b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8" exitCode=0 Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.657436 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7zdk" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.657432 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerDied","Data":"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8"} Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.657496 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7zdk" event={"ID":"b7b05ff9-4558-42dc-9ca7-64d2342a002c","Type":"ContainerDied","Data":"5636a3469c98cb2cea3ccc2c112885d035c7db0bc293d21c6949f2be716e5ceb"} Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.657532 4874 scope.go:117] "RemoveContainer" containerID="b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.689116 4874 scope.go:117] "RemoveContainer" containerID="27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.708212 4874 scope.go:117] "RemoveContainer" containerID="fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.729545 4874 scope.go:117] "RemoveContainer" containerID="b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8" Jan 26 00:12:42 crc kubenswrapper[4874]: E0126 00:12:42.729986 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8\": container with ID starting with b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8 not found: ID does not exist" containerID="b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.730029 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8"} err="failed to get container status \"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8\": rpc error: code = NotFound desc 
= could not find container \"b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8\": container with ID starting with b43debbfcb5ba6b7054713ed8e6c9208ac2880928fde226c27cf805558adb6a8 not found: ID does not exist" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.730062 4874 scope.go:117] "RemoveContainer" containerID="27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc" Jan 26 00:12:42 crc kubenswrapper[4874]: E0126 00:12:42.730443 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc\": container with ID starting with 27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc not found: ID does not exist" containerID="27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.730480 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc"} err="failed to get container status \"27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc\": rpc error: code = NotFound desc = could not find container \"27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc\": container with ID starting with 27b602a18c3c20ecab68e13a2dcad92f730fb665924ad39d93b6b2e7ce74c8dc not found: ID does not exist" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.730513 4874 scope.go:117] "RemoveContainer" containerID="fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7" Jan 26 00:12:42 crc kubenswrapper[4874]: E0126 00:12:42.730785 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7\": container with ID starting with fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7 not found: ID does not exist" containerID="fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.730810 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7"} err="failed to get container status \"fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7\": rpc error: code = NotFound desc = could not find container \"fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7\": container with ID starting with fe51a1fda1009b793a88d8915e47a67b58242f6fc8b01d311f0ece046713afa7 not found: ID does not exist" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.783997 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2mq5\" (UniqueName: \"kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5\") pod \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.784157 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content\") pod \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.784183 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities\") pod \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\" (UID: \"b7b05ff9-4558-42dc-9ca7-64d2342a002c\") " Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.785432 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities" (OuterVolumeSpecName: "utilities") pod "b7b05ff9-4558-42dc-9ca7-64d2342a002c" (UID: "b7b05ff9-4558-42dc-9ca7-64d2342a002c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.788778 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5" (OuterVolumeSpecName: "kube-api-access-z2mq5") pod "b7b05ff9-4558-42dc-9ca7-64d2342a002c" (UID: "b7b05ff9-4558-42dc-9ca7-64d2342a002c"). InnerVolumeSpecName "kube-api-access-z2mq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.813160 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7b05ff9-4558-42dc-9ca7-64d2342a002c" (UID: "b7b05ff9-4558-42dc-9ca7-64d2342a002c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.885448 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.885526 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b05ff9-4558-42dc-9ca7-64d2342a002c-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:42 crc kubenswrapper[4874]: I0126 00:12:42.885545 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2mq5\" (UniqueName: \"kubernetes.io/projected/b7b05ff9-4558-42dc-9ca7-64d2342a002c-kube-api-access-z2mq5\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:43 crc kubenswrapper[4874]: I0126 00:12:43.008945 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:12:43 crc kubenswrapper[4874]: I0126 00:12:43.013210 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7zdk"] Jan 26 00:12:43 crc kubenswrapper[4874]: I0126 00:12:43.632464 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" path="/var/lib/kubelet/pods/b7b05ff9-4558-42dc-9ca7-64d2342a002c/volumes" Jan 26 00:12:44 crc kubenswrapper[4874]: I0126 00:12:44.391253 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:12:44 crc kubenswrapper[4874]: I0126 00:12:44.391522 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" podUID="960d80ae-7893-4dde-a757-fb45b8725c8e" containerName="controller-manager" containerID="cri-o://08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e" gracePeriod=30 Jan 26 00:12:44 
crc kubenswrapper[4874]: I0126 00:12:44.512144 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:12:44 crc kubenswrapper[4874]: I0126 00:12:44.512377 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" podUID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" containerName="route-controller-manager" containerID="cri-o://d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165" gracePeriod=30 Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.414826 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.469261 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.517704 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca\") pod \"960d80ae-7893-4dde-a757-fb45b8725c8e\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.517793 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert\") pod \"960d80ae-7893-4dde-a757-fb45b8725c8e\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.517829 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config\") pod \"960d80ae-7893-4dde-a757-fb45b8725c8e\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.517904 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles\") pod \"960d80ae-7893-4dde-a757-fb45b8725c8e\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.518123 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gntmq\" (UniqueName: \"kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq\") pod \"960d80ae-7893-4dde-a757-fb45b8725c8e\" (UID: \"960d80ae-7893-4dde-a757-fb45b8725c8e\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.518805 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca" (OuterVolumeSpecName: "client-ca") pod "960d80ae-7893-4dde-a757-fb45b8725c8e" (UID: "960d80ae-7893-4dde-a757-fb45b8725c8e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.518879 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "960d80ae-7893-4dde-a757-fb45b8725c8e" (UID: "960d80ae-7893-4dde-a757-fb45b8725c8e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.518919 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.518906 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config" (OuterVolumeSpecName: "config") pod "960d80ae-7893-4dde-a757-fb45b8725c8e" (UID: "960d80ae-7893-4dde-a757-fb45b8725c8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.522714 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq" (OuterVolumeSpecName: "kube-api-access-gntmq") pod "960d80ae-7893-4dde-a757-fb45b8725c8e" (UID: "960d80ae-7893-4dde-a757-fb45b8725c8e"). InnerVolumeSpecName "kube-api-access-gntmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.524030 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "960d80ae-7893-4dde-a757-fb45b8725c8e" (UID: "960d80ae-7893-4dde-a757-fb45b8725c8e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.619975 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert\") pod \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620070 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config\") pod \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620110 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ltk\" (UniqueName: \"kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk\") pod \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620231 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca\") pod \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\" (UID: \"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f\") " Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620667 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620741 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gntmq\" (UniqueName: \"kubernetes.io/projected/960d80ae-7893-4dde-a757-fb45b8725c8e-kube-api-access-gntmq\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620761 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960d80ae-7893-4dde-a757-fb45b8725c8e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.620859 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960d80ae-7893-4dde-a757-fb45b8725c8e-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.621330 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca" (OuterVolumeSpecName: "client-ca") pod "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" (UID: "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.621344 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config" (OuterVolumeSpecName: "config") pod "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" (UID: "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.624676 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" (UID: "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.624853 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk" (OuterVolumeSpecName: "kube-api-access-j4ltk") pod "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" (UID: "fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f"). InnerVolumeSpecName "kube-api-access-j4ltk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.675623 4874 generic.go:334] "Generic (PLEG): container finished" podID="960d80ae-7893-4dde-a757-fb45b8725c8e" containerID="08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e" exitCode=0 Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.675701 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" event={"ID":"960d80ae-7893-4dde-a757-fb45b8725c8e","Type":"ContainerDied","Data":"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e"} Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.675719 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.675743 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pns4h" event={"ID":"960d80ae-7893-4dde-a757-fb45b8725c8e","Type":"ContainerDied","Data":"2ba63b5502bc0c1a7b45584aa5a0b4fb2f4e88c035581728f552f10712673ed5"} Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.675768 4874 scope.go:117] "RemoveContainer" containerID="08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.678508 4874 generic.go:334] "Generic (PLEG): container finished" podID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" containerID="d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165" exitCode=0 Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.678562 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" event={"ID":"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f","Type":"ContainerDied","Data":"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165"} Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.678587 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.678592 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6" event={"ID":"fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f","Type":"ContainerDied","Data":"76936ddede7b94a617f20b2d0e6f2326548be5378ddaf97469011d5e527767af"} Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.703750 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.704558 4874 scope.go:117] "RemoveContainer" containerID="08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.705158 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e\": container with ID starting with 08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e not found: ID does not exist" containerID="08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.705188 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e"} err="failed to get container status \"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e\": rpc error: code = NotFound desc = could not find container \"08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e\": container with ID starting with 08b3d9dbf89db58b17da7bb12c99b1ef7ae6de2dc4fafdd0959afbf47f303f5e not found: ID does not exist" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.705214 4874 scope.go:117] "RemoveContainer" containerID="d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.710548 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pns4h"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.714417 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.717463 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rrm6"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.721315 4874 scope.go:117] "RemoveContainer" containerID="d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.721801 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.721819 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ltk\" (UniqueName: \"kubernetes.io/projected/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-kube-api-access-j4ltk\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.721827 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.721835 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.722598 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165\": container with ID starting with d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165 not found: ID does not exist" containerID="d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.722663 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165"} err="failed to get container status \"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165\": rpc error: code = NotFound desc = could not find container \"d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165\": container with ID starting with d714068503dc292f064746c9974805967aa35e55d823855d79dc1da45fb21165 not found: ID does not exist" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.809002 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811222 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="960d80ae-7893-4dde-a757-fb45b8725c8e" containerName="controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811254 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="960d80ae-7893-4dde-a757-fb45b8725c8e" containerName="controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811265 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811271 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811281 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" containerName="route-controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811287 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" containerName="route-controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811300 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="extract-content" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811306 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="extract-content" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811322 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="extract-utilities" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811332 4874 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="extract-utilities" Jan 26 00:12:45 crc kubenswrapper[4874]: E0126 00:12:45.811346 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="registry-server" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811354 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="registry-server" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811480 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811493 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b05ff9-4558-42dc-9ca7-64d2342a002c" containerName="registry-server" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811510 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="960d80ae-7893-4dde-a757-fb45b8725c8e" containerName="controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811523 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" containerName="route-controller-manager" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.811928 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.814524 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.814656 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.815285 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.815844 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.816146 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.816174 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.819003 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.819815 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.822353 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.822730 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.822891 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823116 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823286 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.822735 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823405 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzcq\" (UniqueName: \"kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823434 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823464 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndzg\" (UniqueName: \"kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823492 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823538 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.823702 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.825621 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.825734 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.825790 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.832546 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.837938 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.847219 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.927646 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928020 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzcq\" (UniqueName: \"kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928051 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " 
pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928076 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndzg\" (UniqueName: \"kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928097 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928122 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928149 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928190 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.928223 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.929395 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.929440 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.929754 4874 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.929882 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.930040 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.932275 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.932388 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.948070 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzcq\" (UniqueName: \"kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq\") pod \"controller-manager-6dbd4c5b74-52xsr\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:45 crc kubenswrapper[4874]: I0126 00:12:45.952389 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndzg\" (UniqueName: \"kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg\") pod \"route-controller-manager-778677f976-wsdqp\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.148580 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.155998 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.376795 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:12:46 crc kubenswrapper[4874]: W0126 00:12:46.412919 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313bad0d_b020_4953_a14b_2f2b191888ea.slice/crio-a9a638312227fd384a3f42b68c4acdbc708ffc2896a7701b2cf7d52b75474a6a WatchSource:0}: Error finding container a9a638312227fd384a3f42b68c4acdbc708ffc2896a7701b2cf7d52b75474a6a: Status 404 returned error can't find the container with id a9a638312227fd384a3f42b68c4acdbc708ffc2896a7701b2cf7d52b75474a6a Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.420222 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.687426 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" event={"ID":"2d1ecab2-53e4-407e-b14a-3b613ed74612","Type":"ContainerStarted","Data":"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca"} Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.687781 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" event={"ID":"2d1ecab2-53e4-407e-b14a-3b613ed74612","Type":"ContainerStarted","Data":"adc1e8eed40bfc77f89b5781cead417ba60192c0ad87e8d25ab65ed950fa888b"} Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.690627 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" event={"ID":"313bad0d-b020-4953-a14b-2f2b191888ea","Type":"ContainerStarted","Data":"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316"} Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.690667 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" event={"ID":"313bad0d-b020-4953-a14b-2f2b191888ea","Type":"ContainerStarted","Data":"a9a638312227fd384a3f42b68c4acdbc708ffc2896a7701b2cf7d52b75474a6a"} Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.690914 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.693268 4874 patch_prober.go:28] interesting pod/route-controller-manager-778677f976-wsdqp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.693315 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Jan 26 00:12:46 crc kubenswrapper[4874]: I0126 00:12:46.712542 4874 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" podStartSLOduration=2.712516598 podStartE2EDuration="2.712516598s" podCreationTimestamp="2026-01-26 00:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:12:46.709318446 +0000 UTC m=+341.443424983" watchObservedRunningTime="2026-01-26 00:12:46.712516598 +0000 UTC m=+341.446623145" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.629214 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="960d80ae-7893-4dde-a757-fb45b8725c8e" path="/var/lib/kubelet/pods/960d80ae-7893-4dde-a757-fb45b8725c8e/volumes" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.629936 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f" path="/var/lib/kubelet/pods/fabbdf80-9fdf-4e2c-aeef-5c0e7f09129f/volumes" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.700125 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.704500 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.704694 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:12:47 crc kubenswrapper[4874]: I0126 00:12:47.719625 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" podStartSLOduration=3.7196061030000003 podStartE2EDuration="3.719606103s" podCreationTimestamp="2026-01-26 00:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:12:47.716061174 +0000 UTC m=+342.450167731" watchObservedRunningTime="2026-01-26 00:12:47.719606103 +0000 UTC m=+342.453712640" Jan 26 00:13:20 crc kubenswrapper[4874]: I0126 00:13:20.508871 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:13:20 crc kubenswrapper[4874]: I0126 00:13:20.509660 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qx779" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="registry-server" containerID="cri-o://62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" gracePeriod=2 Jan 26 00:13:20 crc kubenswrapper[4874]: I0126 00:13:20.704355 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:13:20 crc kubenswrapper[4874]: I0126 00:13:20.704649 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-llwxh" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="registry-server" containerID="cri-o://d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" gracePeriod=2 Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.053176 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca is running failed: container process not found" containerID="d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.053667 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca is running failed: container process not found" containerID="d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.054269 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca is running failed: container process not found" containerID="d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.054327 4874 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-llwxh" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="registry-server" Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.285929 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b is running failed: container process not found" containerID="62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.287480 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b is running failed: container process not found" containerID="62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.288186 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b is running failed: container process not found" containerID="62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:13:21 crc kubenswrapper[4874]: E0126 00:13:21.288289 4874 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-qx779" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="registry-server" Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.444428 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.444535 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.909325 4874 generic.go:334] "Generic (PLEG): container finished" podID="ff351744-6cfe-455c-851d-c8e218efee0e" containerID="d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" exitCode=0 Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.909408 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerDied","Data":"d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca"} Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.911651 4874 generic.go:334] "Generic (PLEG): container finished" podID="f11f67a6-370d-4591-bebd-152ad8e10326" containerID="62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" exitCode=0 Jan 26 00:13:22 crc kubenswrapper[4874]: I0126 00:13:22.911694 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerDied","Data":"62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b"} Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.619132 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.779151 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content\") pod \"f11f67a6-370d-4591-bebd-152ad8e10326\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.779250 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdpz\" (UniqueName: \"kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz\") pod \"f11f67a6-370d-4591-bebd-152ad8e10326\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.779270 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities\") pod \"f11f67a6-370d-4591-bebd-152ad8e10326\" (UID: \"f11f67a6-370d-4591-bebd-152ad8e10326\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.780826 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities" (OuterVolumeSpecName: "utilities") pod "f11f67a6-370d-4591-bebd-152ad8e10326" (UID: "f11f67a6-370d-4591-bebd-152ad8e10326"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.796875 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz" (OuterVolumeSpecName: "kube-api-access-bqdpz") pod "f11f67a6-370d-4591-bebd-152ad8e10326" (UID: "f11f67a6-370d-4591-bebd-152ad8e10326"). InnerVolumeSpecName "kube-api-access-bqdpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.828540 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.851842 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f11f67a6-370d-4591-bebd-152ad8e10326" (UID: "f11f67a6-370d-4591-bebd-152ad8e10326"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.880792 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.880832 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdpz\" (UniqueName: \"kubernetes.io/projected/f11f67a6-370d-4591-bebd-152ad8e10326-kube-api-access-bqdpz\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.880845 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11f67a6-370d-4591-bebd-152ad8e10326-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.919083 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx779" event={"ID":"f11f67a6-370d-4591-bebd-152ad8e10326","Type":"ContainerDied","Data":"a8e9a1b1ca744d78a447b24983bfafb8db9bec320c01f4373a60a15a788749bc"} Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.919147 4874 scope.go:117] "RemoveContainer" containerID="62a159aaac16002d8a41d04a0fd2ad5ae08f192172b9b0127ca2bd6d7ce5311b" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.919145 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx779" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.922450 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llwxh" event={"ID":"ff351744-6cfe-455c-851d-c8e218efee0e","Type":"ContainerDied","Data":"9a4d1a0a7b66b4aca57010deebe6f4f4fa6827ce43b54eeda932f2a28d75b44a"} Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.922510 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llwxh" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.946263 4874 scope.go:117] "RemoveContainer" containerID="b503b86aa94507f76586e00396d696e2c0418beeba252d0af8eacad174cf898d" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.946421 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.952269 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qx779"] Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.968004 4874 scope.go:117] "RemoveContainer" containerID="cd10a81d45966b3ca36847c3a9fd8dab2471d6070551eb4fd5fed74cac2cdfa5" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.981496 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content\") pod \"ff351744-6cfe-455c-851d-c8e218efee0e\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.981563 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities\") pod \"ff351744-6cfe-455c-851d-c8e218efee0e\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.981624 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwgr\" (UniqueName: \"kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr\") pod \"ff351744-6cfe-455c-851d-c8e218efee0e\" (UID: \"ff351744-6cfe-455c-851d-c8e218efee0e\") " Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.982357 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities" (OuterVolumeSpecName: "utilities") pod "ff351744-6cfe-455c-851d-c8e218efee0e" (UID: "ff351744-6cfe-455c-851d-c8e218efee0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.982935 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.984306 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr" (OuterVolumeSpecName: "kube-api-access-mtwgr") pod "ff351744-6cfe-455c-851d-c8e218efee0e" (UID: "ff351744-6cfe-455c-851d-c8e218efee0e"). InnerVolumeSpecName "kube-api-access-mtwgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:23 crc kubenswrapper[4874]: I0126 00:13:23.986145 4874 scope.go:117] "RemoveContainer" containerID="d414683280fafedc649982559b803f5da1d349d0eee9cd845d94f33294fe93ca" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.001077 4874 scope.go:117] "RemoveContainer" containerID="b667c475e509ba98231fd329931e8726809210a69378495d72137dcb9528060e" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.015944 4874 scope.go:117] "RemoveContainer" containerID="d877e7285c966607ab2e3833349b1c93314219991a5a072336334f3a4a4e95f6" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.037425 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff351744-6cfe-455c-851d-c8e218efee0e" (UID: "ff351744-6cfe-455c-851d-c8e218efee0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.084626 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff351744-6cfe-455c-851d-c8e218efee0e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.084668 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwgr\" (UniqueName: \"kubernetes.io/projected/ff351744-6cfe-455c-851d-c8e218efee0e-kube-api-access-mtwgr\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.252996 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.257026 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-llwxh"] Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.383141 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.383446 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" podUID="2d1ecab2-53e4-407e-b14a-3b613ed74612" containerName="controller-manager" containerID="cri-o://2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca" gracePeriod=30 Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.394405 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.394769 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" containerName="route-controller-manager" containerID="cri-o://faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316" gracePeriod=30 Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.840866 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.844451 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.929663 4874 generic.go:334] "Generic (PLEG): container finished" podID="2d1ecab2-53e4-407e-b14a-3b613ed74612" containerID="2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca" exitCode=0 Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.929738 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" event={"ID":"2d1ecab2-53e4-407e-b14a-3b613ed74612","Type":"ContainerDied","Data":"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca"} Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.929767 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" event={"ID":"2d1ecab2-53e4-407e-b14a-3b613ed74612","Type":"ContainerDied","Data":"adc1e8eed40bfc77f89b5781cead417ba60192c0ad87e8d25ab65ed950fa888b"} Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.929786 4874 scope.go:117] "RemoveContainer" containerID="2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.929887 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.933782 4874 generic.go:334] "Generic (PLEG): container finished" podID="313bad0d-b020-4953-a14b-2f2b191888ea" containerID="faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316" exitCode=0 Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.933821 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" event={"ID":"313bad0d-b020-4953-a14b-2f2b191888ea","Type":"ContainerDied","Data":"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316"} Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.933844 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" event={"ID":"313bad0d-b020-4953-a14b-2f2b191888ea","Type":"ContainerDied","Data":"a9a638312227fd384a3f42b68c4acdbc708ffc2896a7701b2cf7d52b75474a6a"} Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.933856 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.946575 4874 scope.go:117] "RemoveContainer" containerID="2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca" Jan 26 00:13:24 crc kubenswrapper[4874]: E0126 00:13:24.947314 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca\": container with ID starting with 2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca not found: ID does not exist" containerID="2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.947365 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca"} err="failed to get container status \"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca\": rpc error: code = NotFound desc = could not find container \"2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca\": container with ID starting with 2f9340b0fa3abee3452071a1953b01017f3b1014d1ed3995773587d793aa39ca not found: ID does not exist" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.947397 4874 scope.go:117] "RemoveContainer" containerID="faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.964860 4874 scope.go:117] "RemoveContainer" containerID="faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316" Jan 26 00:13:24 crc kubenswrapper[4874]: E0126 00:13:24.965409 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316\": container with ID starting with faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316 not found: ID does not exist" containerID="faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316" Jan 26 00:13:24 crc kubenswrapper[4874]: I0126 00:13:24.965440 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316"} err="failed to get container status \"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316\": rpc error: code = NotFound desc = could not find container \"faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316\": container with ID starting with faa6cc32453d86e657013295b138fb1ad246265e0c6b1f2c1aff359fe7826316 not found: ID does not exist" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000657 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles\") pod \"2d1ecab2-53e4-407e-b14a-3b613ed74612\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000728 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert\") pod \"313bad0d-b020-4953-a14b-2f2b191888ea\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000760 4874 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca\") pod \"2d1ecab2-53e4-407e-b14a-3b613ed74612\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000806 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config\") pod \"313bad0d-b020-4953-a14b-2f2b191888ea\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000869 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzcq\" (UniqueName: \"kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq\") pod \"2d1ecab2-53e4-407e-b14a-3b613ed74612\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000908 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca\") pod \"313bad0d-b020-4953-a14b-2f2b191888ea\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000943 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert\") pod \"2d1ecab2-53e4-407e-b14a-3b613ed74612\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000980 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config\") pod \"2d1ecab2-53e4-407e-b14a-3b613ed74612\" (UID: \"2d1ecab2-53e4-407e-b14a-3b613ed74612\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.000997 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zndzg\" (UniqueName: \"kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg\") pod \"313bad0d-b020-4953-a14b-2f2b191888ea\" (UID: \"313bad0d-b020-4953-a14b-2f2b191888ea\") " Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.002862 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "313bad0d-b020-4953-a14b-2f2b191888ea" (UID: "313bad0d-b020-4953-a14b-2f2b191888ea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.002874 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d1ecab2-53e4-407e-b14a-3b613ed74612" (UID: "2d1ecab2-53e4-407e-b14a-3b613ed74612"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.002923 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2d1ecab2-53e4-407e-b14a-3b613ed74612" (UID: "2d1ecab2-53e4-407e-b14a-3b613ed74612"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.002932 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config" (OuterVolumeSpecName: "config") pod "2d1ecab2-53e4-407e-b14a-3b613ed74612" (UID: "2d1ecab2-53e4-407e-b14a-3b613ed74612"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.003573 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config" (OuterVolumeSpecName: "config") pod "313bad0d-b020-4953-a14b-2f2b191888ea" (UID: "313bad0d-b020-4953-a14b-2f2b191888ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.005234 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d1ecab2-53e4-407e-b14a-3b613ed74612" (UID: "2d1ecab2-53e4-407e-b14a-3b613ed74612"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.005313 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg" (OuterVolumeSpecName: "kube-api-access-zndzg") pod "313bad0d-b020-4953-a14b-2f2b191888ea" (UID: "313bad0d-b020-4953-a14b-2f2b191888ea"). InnerVolumeSpecName "kube-api-access-zndzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.005418 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "313bad0d-b020-4953-a14b-2f2b191888ea" (UID: "313bad0d-b020-4953-a14b-2f2b191888ea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.007560 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq" (OuterVolumeSpecName: "kube-api-access-vnzcq") pod "2d1ecab2-53e4-407e-b14a-3b613ed74612" (UID: "2d1ecab2-53e4-407e-b14a-3b613ed74612"). InnerVolumeSpecName "kube-api-access-vnzcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102529 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102586 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1ecab2-53e4-407e-b14a-3b613ed74612-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102596 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102604 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zndzg\" (UniqueName: \"kubernetes.io/projected/313bad0d-b020-4953-a14b-2f2b191888ea-kube-api-access-zndzg\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102613 4874 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102623 4874 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313bad0d-b020-4953-a14b-2f2b191888ea-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102629 4874 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d1ecab2-53e4-407e-b14a-3b613ed74612-client-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102637 4874 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313bad0d-b020-4953-a14b-2f2b191888ea-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.102645 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzcq\" (UniqueName: \"kubernetes.io/projected/2d1ecab2-53e4-407e-b14a-3b613ed74612-kube-api-access-vnzcq\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.270027 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.270579 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-52xsr"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.279902 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.289026 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-wsdqp"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.628109 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1ecab2-53e4-407e-b14a-3b613ed74612" path="/var/lib/kubelet/pods/2d1ecab2-53e4-407e-b14a-3b613ed74612/volumes" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.628780 4874 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" path="/var/lib/kubelet/pods/313bad0d-b020-4953-a14b-2f2b191888ea/volumes" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.629409 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" path="/var/lib/kubelet/pods/f11f67a6-370d-4591-bebd-152ad8e10326/volumes" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.631133 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" path="/var/lib/kubelet/pods/ff351744-6cfe-455c-851d-c8e218efee0e/volumes" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848317 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns"] Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848679 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="extract-content" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848702 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="extract-content" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848720 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848731 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848749 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="extract-utilities" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848761 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="extract-utilities" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848781 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="extract-content" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848792 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="extract-content" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848805 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" containerName="route-controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848815 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" containerName="route-controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848832 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1ecab2-53e4-407e-b14a-3b613ed74612" containerName="controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848843 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1ecab2-53e4-407e-b14a-3b613ed74612" containerName="controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848860 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 
00:13:25.848871 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: E0126 00:13:25.848887 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="extract-utilities" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.848897 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="extract-utilities" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.849105 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="313bad0d-b020-4953-a14b-2f2b191888ea" containerName="route-controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.849125 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1ecab2-53e4-407e-b14a-3b613ed74612" containerName="controller-manager" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.849142 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11f67a6-370d-4591-bebd-152ad8e10326" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.849159 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff351744-6cfe-455c-851d-c8e218efee0e" containerName="registry-server" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.849766 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.851479 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.851881 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.851908 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.852129 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.852563 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.853641 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.856626 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-576c86f7f8-vtds4"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.857833 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.862111 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.862363 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.862651 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.864447 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.864636 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.865045 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.869827 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.871221 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns"] Jan 26 00:13:25 crc kubenswrapper[4874]: I0126 00:13:25.875051 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-576c86f7f8-vtds4"] Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014284 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c39ab01-f725-443d-b5c5-1a3cf8947b44-serving-cert\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014343 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-serving-cert\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014376 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-client-ca\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014447 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-proxy-ca-bundles\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 
00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014487 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-client-ca\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014527 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5x5g\" (UniqueName: \"kubernetes.io/projected/6c39ab01-f725-443d-b5c5-1a3cf8947b44-kube-api-access-c5x5g\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014603 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-config\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014642 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26ff\" (UniqueName: \"kubernetes.io/projected/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-kube-api-access-n26ff\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.014662 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-config\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116128 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-serving-cert\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116193 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-client-ca\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116236 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-proxy-ca-bundles\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116277 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-client-ca\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116305 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5x5g\" (UniqueName: \"kubernetes.io/projected/6c39ab01-f725-443d-b5c5-1a3cf8947b44-kube-api-access-c5x5g\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116331 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-config\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116362 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-config\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116381 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n26ff\" (UniqueName: \"kubernetes.io/projected/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-kube-api-access-n26ff\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.116406 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c39ab01-f725-443d-b5c5-1a3cf8947b44-serving-cert\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.118753 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-client-ca\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.118993 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-config\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.119832 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-proxy-ca-bundles\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.120186 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c39ab01-f725-443d-b5c5-1a3cf8947b44-client-ca\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.123851 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-config\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.131744 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c39ab01-f725-443d-b5c5-1a3cf8947b44-serving-cert\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.137607 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-serving-cert\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.146426 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26ff\" (UniqueName: \"kubernetes.io/projected/e181bd05-d126-46eb-8bbf-5bbff4d8ffe0-kube-api-access-n26ff\") pod \"controller-manager-576c86f7f8-vtds4\" (UID: \"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0\") " pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.149410 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5x5g\" (UniqueName: \"kubernetes.io/projected/6c39ab01-f725-443d-b5c5-1a3cf8947b44-kube-api-access-c5x5g\") pod \"route-controller-manager-6bbb7fdb9-qkjns\" (UID: \"6c39ab01-f725-443d-b5c5-1a3cf8947b44\") " pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.199526 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.206953 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.872000 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-576c86f7f8-vtds4"] Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.876342 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns"] Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.953021 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" event={"ID":"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0","Type":"ContainerStarted","Data":"949ff850a6dacd5a752a8c3c202b6964bc8ce261c241058ad06652252bd5b48d"} Jan 26 00:13:26 crc kubenswrapper[4874]: I0126 00:13:26.954394 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" event={"ID":"6c39ab01-f725-443d-b5c5-1a3cf8947b44","Type":"ContainerStarted","Data":"d1c0cdf2e5dc2e734955559bada2ecb56e3180ac40fc17421cc0f2e876639fd5"} Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.777386 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m5fbw"] Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.778877 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.792665 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m5fbw"] Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.943744 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-tls\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.943801 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.943846 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2srg\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-kube-api-access-g2srg\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.943896 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-certificates\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 
00:13:27.943934 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-trusted-ca\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.943954 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-bound-sa-token\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.944001 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.944067 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.967805 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" event={"ID":"6c39ab01-f725-443d-b5c5-1a3cf8947b44","Type":"ContainerStarted","Data":"652188ff05535d426388e934288f1b4137155a0a00c15a94f232b32c58ea244e"} Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.968247 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.971438 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" event={"ID":"e181bd05-d126-46eb-8bbf-5bbff4d8ffe0","Type":"ContainerStarted","Data":"c08b5cce07631b2f0ffb03cc9d13693a93d2d0112cfbfa8f43d27f4985b79027"} Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.971712 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.973746 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.975711 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.977211 4874 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" Jan 26 00:13:27 crc kubenswrapper[4874]: I0126 00:13:27.988662 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bbb7fdb9-qkjns" podStartSLOduration=3.988625715 podStartE2EDuration="3.988625715s" podCreationTimestamp="2026-01-26 00:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:13:27.98408912 +0000 UTC m=+382.718195677" watchObservedRunningTime="2026-01-26 00:13:27.988625715 +0000 UTC m=+382.722732262" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.022603 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-576c86f7f8-vtds4" podStartSLOduration=4.022574449 podStartE2EDuration="4.022574449s" podCreationTimestamp="2026-01-26 00:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:13:28.00480773 +0000 UTC m=+382.738914267" watchObservedRunningTime="2026-01-26 00:13:28.022574449 +0000 UTC m=+382.756680986" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.044811 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-tls\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.044868 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.044893 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2srg\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-kube-api-access-g2srg\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.044934 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-certificates\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.044990 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-trusted-ca\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.045007 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-bound-sa-token\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.045028 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.045782 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.047190 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-certificates\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.048247 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-trusted-ca\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.052444 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-registry-tls\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.063696 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-bound-sa-token\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.065923 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.075588 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2srg\" (UniqueName: \"kubernetes.io/projected/21faf9fb-9d04-4f91-8938-1745cb9ab3bc-kube-api-access-g2srg\") pod \"image-registry-66df7c8f76-m5fbw\" (UID: \"21faf9fb-9d04-4f91-8938-1745cb9ab3bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 
00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.099369 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.498063 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m5fbw"] Jan 26 00:13:28 crc kubenswrapper[4874]: W0126 00:13:28.512851 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21faf9fb_9d04_4f91_8938_1745cb9ab3bc.slice/crio-c1b9f2c110c97a96a243eb4c5ed1c5da2ec5336da58d2e85dbc6546a6813f1f7 WatchSource:0}: Error finding container c1b9f2c110c97a96a243eb4c5ed1c5da2ec5336da58d2e85dbc6546a6813f1f7: Status 404 returned error can't find the container with id c1b9f2c110c97a96a243eb4c5ed1c5da2ec5336da58d2e85dbc6546a6813f1f7 Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.987034 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" event={"ID":"21faf9fb-9d04-4f91-8938-1745cb9ab3bc","Type":"ContainerStarted","Data":"5b7f8e079c23e2b4d72def46406191d73dc290a1afe2beaf125bb027cd92cc85"} Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.987676 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:28 crc kubenswrapper[4874]: I0126 00:13:28.987702 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" event={"ID":"21faf9fb-9d04-4f91-8938-1745cb9ab3bc","Type":"ContainerStarted","Data":"c1b9f2c110c97a96a243eb4c5ed1c5da2ec5336da58d2e85dbc6546a6813f1f7"} Jan 26 00:13:29 crc kubenswrapper[4874]: I0126 00:13:29.013206 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" podStartSLOduration=2.013176495 podStartE2EDuration="2.013176495s" podCreationTimestamp="2026-01-26 00:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:13:29.010909293 +0000 UTC m=+383.745015870" watchObservedRunningTime="2026-01-26 00:13:29.013176495 +0000 UTC m=+383.747283032" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.677044 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.677950 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ps2v4" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="registry-server" containerID="cri-o://1bb5f4804d8eb0fc30a130467d8bdad5af0a68675807b7ec93cb68d904be4306" gracePeriod=30 Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.679015 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.679274 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r4fgs" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="registry-server" containerID="cri-o://e43e8b14ab04e71647f442ce65934eb39bce4ce4b23631584ac981674894af72" gracePeriod=30 Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.690497 4874 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.690738 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" containerID="cri-o://2fca9c432c38d8d0a50bbff06297e1234e088668ed43513c7b40288310fb8a3e" gracePeriod=30 Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.702447 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.702756 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s2dns" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="registry-server" containerID="cri-o://1e8560cdbf43be5cf7a9c86479cab3ac6acaeb6f2cab6a337f3c89feeb33e591" gracePeriod=30 Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.717307 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.717678 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bdrl5" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="registry-server" containerID="cri-o://763afb54b3bfddcbb7c1525a40752f55111a0622eeb9d7d7b188f8313b70a4cf" gracePeriod=30 Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.723513 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qbtnf"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.724405 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.731188 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qbtnf"] Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.781792 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5724916b-98d3-4a48-8da3-b81427657599-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.781911 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5724916b-98d3-4a48-8da3-b81427657599-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.781947 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdk2p\" (UniqueName: \"kubernetes.io/projected/5724916b-98d3-4a48-8da3-b81427657599-kube-api-access-jdk2p\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.883205 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5724916b-98d3-4a48-8da3-b81427657599-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.883271 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdk2p\" (UniqueName: \"kubernetes.io/projected/5724916b-98d3-4a48-8da3-b81427657599-kube-api-access-jdk2p\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.883306 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5724916b-98d3-4a48-8da3-b81427657599-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.884901 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5724916b-98d3-4a48-8da3-b81427657599-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.893122 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/5724916b-98d3-4a48-8da3-b81427657599-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:34 crc kubenswrapper[4874]: I0126 00:13:34.905829 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdk2p\" (UniqueName: \"kubernetes.io/projected/5724916b-98d3-4a48-8da3-b81427657599-kube-api-access-jdk2p\") pod \"marketplace-operator-79b997595-qbtnf\" (UID: \"5724916b-98d3-4a48-8da3-b81427657599\") " pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:35 crc kubenswrapper[4874]: I0126 00:13:35.057085 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:35 crc kubenswrapper[4874]: I0126 00:13:35.474773 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qbtnf"] Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.036733 4874 generic.go:334] "Generic (PLEG): container finished" podID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerID="2fca9c432c38d8d0a50bbff06297e1234e088668ed43513c7b40288310fb8a3e" exitCode=0 Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.037058 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerDied","Data":"2fca9c432c38d8d0a50bbff06297e1234e088668ed43513c7b40288310fb8a3e"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.037118 4874 scope.go:117] "RemoveContainer" containerID="334b5aea9863f53fcc5738bcac353ee91b6959e47bdc154b420c1f8104992b4e" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.040644 4874 generic.go:334] "Generic (PLEG): container finished" podID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerID="1e8560cdbf43be5cf7a9c86479cab3ac6acaeb6f2cab6a337f3c89feeb33e591" exitCode=0 Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.040697 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerDied","Data":"1e8560cdbf43be5cf7a9c86479cab3ac6acaeb6f2cab6a337f3c89feeb33e591"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.042148 4874 generic.go:334] "Generic (PLEG): container finished" podID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerID="1bb5f4804d8eb0fc30a130467d8bdad5af0a68675807b7ec93cb68d904be4306" exitCode=0 Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.042190 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerDied","Data":"1bb5f4804d8eb0fc30a130467d8bdad5af0a68675807b7ec93cb68d904be4306"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.043001 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" event={"ID":"5724916b-98d3-4a48-8da3-b81427657599","Type":"ContainerStarted","Data":"8b789a6788193d6ffbf33aff83c77411c758bac59cc4e87b91cb80c9a3eecb68"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.043025 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" 
event={"ID":"5724916b-98d3-4a48-8da3-b81427657599","Type":"ContainerStarted","Data":"4d4497333d0fcba0b1988457172aff08ceca4163e26659a817730f2e33612f1c"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.044137 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.061659 4874 generic.go:334] "Generic (PLEG): container finished" podID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerID="763afb54b3bfddcbb7c1525a40752f55111a0622eeb9d7d7b188f8313b70a4cf" exitCode=0 Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.061774 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerDied","Data":"763afb54b3bfddcbb7c1525a40752f55111a0622eeb9d7d7b188f8313b70a4cf"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.061893 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.065511 4874 generic.go:334] "Generic (PLEG): container finished" podID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerID="e43e8b14ab04e71647f442ce65934eb39bce4ce4b23631584ac981674894af72" exitCode=0 Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.065546 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerDied","Data":"e43e8b14ab04e71647f442ce65934eb39bce4ce4b23631584ac981674894af72"} Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.096360 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qbtnf" podStartSLOduration=2.096344042 podStartE2EDuration="2.096344042s" podCreationTimestamp="2026-01-26 00:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:13:36.076514357 +0000 UTC m=+390.810620894" watchObservedRunningTime="2026-01-26 00:13:36.096344042 +0000 UTC m=+390.830450579" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.225295 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.401126 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfx9\" (UniqueName: \"kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9\") pod \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.401453 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities\") pod \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.401493 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content\") pod \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\" (UID: \"87afa5b5-ce89-41ab-8fe6-cb82acb882aa\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.405748 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities" (OuterVolumeSpecName: "utilities") pod "87afa5b5-ce89-41ab-8fe6-cb82acb882aa" (UID: "87afa5b5-ce89-41ab-8fe6-cb82acb882aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.420291 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9" (OuterVolumeSpecName: "kube-api-access-gcfx9") pod "87afa5b5-ce89-41ab-8fe6-cb82acb882aa" (UID: "87afa5b5-ce89-41ab-8fe6-cb82acb882aa"). InnerVolumeSpecName "kube-api-access-gcfx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.491294 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.494732 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.496868 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.509611 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.509645 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfx9\" (UniqueName: \"kubernetes.io/projected/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-kube-api-access-gcfx9\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.536611 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87afa5b5-ce89-41ab-8fe6-cb82acb882aa" (UID: "87afa5b5-ce89-41ab-8fe6-cb82acb882aa"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.595847 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.610185 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content\") pod \"259aa26b-333f-4b21-8bae-8494dbc94b50\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.610222 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content\") pod \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.610334 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities\") pod \"259aa26b-333f-4b21-8bae-8494dbc94b50\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611163 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities" (OuterVolumeSpecName: "utilities") pod "259aa26b-333f-4b21-8bae-8494dbc94b50" (UID: "259aa26b-333f-4b21-8bae-8494dbc94b50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.610366 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxv7\" (UniqueName: \"kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7\") pod \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\" (UID: \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611800 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities\") pod \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611825 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56c6b\" (UniqueName: \"kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b\") pod \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611844 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpqtk\" (UniqueName: \"kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk\") pod \"8aa0d0bc-1d01-402d-a9aa-97a984717116\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611860 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities\") pod \"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\" (UID: 
\"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611883 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content\") pod \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\" (UID: \"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611903 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jmh\" (UniqueName: \"kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh\") pod \"259aa26b-333f-4b21-8bae-8494dbc94b50\" (UID: \"259aa26b-333f-4b21-8bae-8494dbc94b50\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.611946 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics\") pod \"8aa0d0bc-1d01-402d-a9aa-97a984717116\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.612142 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.612163 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87afa5b5-ce89-41ab-8fe6-cb82acb882aa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.613217 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities" (OuterVolumeSpecName: "utilities") pod "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" (UID: "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.613836 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities" (OuterVolumeSpecName: "utilities") pod "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" (UID: "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.615213 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7" (OuterVolumeSpecName: "kube-api-access-xxxv7") pod "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" (UID: "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7"). InnerVolumeSpecName "kube-api-access-xxxv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.617261 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b" (OuterVolumeSpecName: "kube-api-access-56c6b") pod "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" (UID: "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5"). InnerVolumeSpecName "kube-api-access-56c6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.617985 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8aa0d0bc-1d01-402d-a9aa-97a984717116" (UID: "8aa0d0bc-1d01-402d-a9aa-97a984717116"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.618197 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk" (OuterVolumeSpecName: "kube-api-access-hpqtk") pod "8aa0d0bc-1d01-402d-a9aa-97a984717116" (UID: "8aa0d0bc-1d01-402d-a9aa-97a984717116"). InnerVolumeSpecName "kube-api-access-hpqtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.623016 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh" (OuterVolumeSpecName: "kube-api-access-k8jmh") pod "259aa26b-333f-4b21-8bae-8494dbc94b50" (UID: "259aa26b-333f-4b21-8bae-8494dbc94b50"). InnerVolumeSpecName "kube-api-access-k8jmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.642690 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" (UID: "3a28b4a4-b5c7-4f8f-9577-a65d9148efb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.659429 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259aa26b-333f-4b21-8bae-8494dbc94b50" (UID: "259aa26b-333f-4b21-8bae-8494dbc94b50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.712669 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca\") pod \"8aa0d0bc-1d01-402d-a9aa-97a984717116\" (UID: \"8aa0d0bc-1d01-402d-a9aa-97a984717116\") " Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713029 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259aa26b-333f-4b21-8bae-8494dbc94b50-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713045 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxv7\" (UniqueName: \"kubernetes.io/projected/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-kube-api-access-xxxv7\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713058 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713067 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56c6b\" (UniqueName: \"kubernetes.io/projected/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-kube-api-access-56c6b\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713076 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpqtk\" (UniqueName: \"kubernetes.io/projected/8aa0d0bc-1d01-402d-a9aa-97a984717116-kube-api-access-hpqtk\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713084 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713093 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713101 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jmh\" (UniqueName: \"kubernetes.io/projected/259aa26b-333f-4b21-8bae-8494dbc94b50-kube-api-access-k8jmh\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713109 4874 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.713551 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8aa0d0bc-1d01-402d-a9aa-97a984717116" (UID: "8aa0d0bc-1d01-402d-a9aa-97a984717116"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.761689 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" (UID: "bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.814042 4874 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aa0d0bc-1d01-402d-a9aa-97a984717116-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:36 crc kubenswrapper[4874]: I0126 00:13:36.814082 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.074236 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" event={"ID":"8aa0d0bc-1d01-402d-a9aa-97a984717116","Type":"ContainerDied","Data":"6e97477674a8683fa399a0f2fdaa72a70bcd18eac2f3a675d5c59b9f0838e7ad"} Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.074251 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8ffn" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.074294 4874 scope.go:117] "RemoveContainer" containerID="2fca9c432c38d8d0a50bbff06297e1234e088668ed43513c7b40288310fb8a3e" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.079101 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2dns" event={"ID":"3a28b4a4-b5c7-4f8f-9577-a65d9148efb5","Type":"ContainerDied","Data":"774c82869774b1bfbcc58a500b4be436c66dcbfe975967271f0cb6a3c7f7f663"} Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.079201 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2dns" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.085706 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ps2v4" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.085708 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ps2v4" event={"ID":"259aa26b-333f-4b21-8bae-8494dbc94b50","Type":"ContainerDied","Data":"e9bb97a55c863a7c5865a423da09e85b519c309c916cb2171f0bda103e6ed9e3"} Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.091508 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdrl5" event={"ID":"bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7","Type":"ContainerDied","Data":"a50314309f7a7fe466d7ed00fdde026d0ee33227802b2223e7673a038fae429b"} Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.091573 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bdrl5" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.098660 4874 scope.go:117] "RemoveContainer" containerID="1e8560cdbf43be5cf7a9c86479cab3ac6acaeb6f2cab6a337f3c89feeb33e591" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.101162 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4fgs" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.101330 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fgs" event={"ID":"87afa5b5-ce89-41ab-8fe6-cb82acb882aa","Type":"ContainerDied","Data":"5af821e5864d2182b0a3e1878b69ed7d168270ad6aaca564b3447c252da847fa"} Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.114033 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.116910 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2dns"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.123533 4874 scope.go:117] "RemoveContainer" containerID="09264b002ad2506a6c12a204398f5ae5f95f6657e88d8e1839b41f1f0378b145" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.129927 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.137627 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8ffn"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.164356 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.167288 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bdrl5"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.189777 4874 scope.go:117] "RemoveContainer" containerID="2c77ceb39ee7dcfd56f9edf137cfa7bc8aaa61e168d9f826089daab3586ddbf8" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.191006 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.199386 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ps2v4"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.204918 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.205542 4874 scope.go:117] "RemoveContainer" containerID="1bb5f4804d8eb0fc30a130467d8bdad5af0a68675807b7ec93cb68d904be4306" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.209586 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r4fgs"] Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.233281 4874 scope.go:117] "RemoveContainer" containerID="1fc982f7b99f1749518ced00e81090470938a5623777f0bd068e8670797dbd56" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.252833 4874 scope.go:117] "RemoveContainer" containerID="fa18429a90e672674b8b6627b229105d2fdec8d60b7e83af4abae8c62274d903" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.273343 4874 scope.go:117] "RemoveContainer" 
containerID="763afb54b3bfddcbb7c1525a40752f55111a0622eeb9d7d7b188f8313b70a4cf" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.289839 4874 scope.go:117] "RemoveContainer" containerID="8e3c81dea0ab72c723aa3cb0dc15a2079a5fb81a547f949caa9e4d7d0350a13d" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.307336 4874 scope.go:117] "RemoveContainer" containerID="c96e65ad44fe7b35b346683fa52eb2210d29a75718982663c3314db1a338dc1b" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.322263 4874 scope.go:117] "RemoveContainer" containerID="e43e8b14ab04e71647f442ce65934eb39bce4ce4b23631584ac981674894af72" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.339892 4874 scope.go:117] "RemoveContainer" containerID="5014dbffb268b6b8b3aa50f0d50238d316b0ae689a8b38591cee56e964859b8e" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.356382 4874 scope.go:117] "RemoveContainer" containerID="d3ee6f5c7b6505d9082eb619b62ffd91443f40d4f302bbb3d4affb339e36a127" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.629886 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" path="/var/lib/kubelet/pods/259aa26b-333f-4b21-8bae-8494dbc94b50/volumes" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.632300 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" path="/var/lib/kubelet/pods/3a28b4a4-b5c7-4f8f-9577-a65d9148efb5/volumes" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.633670 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" path="/var/lib/kubelet/pods/87afa5b5-ce89-41ab-8fe6-cb82acb882aa/volumes" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.635328 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" path="/var/lib/kubelet/pods/8aa0d0bc-1d01-402d-a9aa-97a984717116/volumes" Jan 26 00:13:37 crc kubenswrapper[4874]: I0126 00:13:37.636061 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" path="/var/lib/kubelet/pods/bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7/volumes" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.890409 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g84nj"] Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.890977 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.890993 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891009 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891017 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891030 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891038 4874 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891049 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891057 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891069 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891076 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891084 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891093 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891100 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891106 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891119 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891127 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891136 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891143 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891151 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891159 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891168 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891176 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891188 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891194 
4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="extract-utilities" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891204 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891210 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="extract-content" Jan 26 00:13:38 crc kubenswrapper[4874]: E0126 00:13:38.891222 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891229 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891335 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="87afa5b5-ce89-41ab-8fe6-cb82acb882aa" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891349 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a28b4a4-b5c7-4f8f-9577-a65d9148efb5" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891365 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde4fdb5-abe2-43bf-bf7d-8460bb1a4ea7" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891372 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891383 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa0d0bc-1d01-402d-a9aa-97a984717116" containerName="marketplace-operator" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.891391 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="259aa26b-333f-4b21-8bae-8494dbc94b50" containerName="registry-server" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.892823 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.895183 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 26 00:13:38 crc kubenswrapper[4874]: I0126 00:13:38.897567 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g84nj"] Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.045340 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-utilities\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.045518 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-catalog-content\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.045581 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrfd2\" (UniqueName: \"kubernetes.io/projected/eec2d818-be5e-4803-9725-bb1565d2ba99-kube-api-access-xrfd2\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.083464 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9jch"] Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.084632 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.087388 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.093120 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9jch"] Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.146779 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-utilities\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.146846 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-catalog-content\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.146868 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrfd2\" (UniqueName: \"kubernetes.io/projected/eec2d818-be5e-4803-9725-bb1565d2ba99-kube-api-access-xrfd2\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.147613 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-catalog-content\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.147670 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eec2d818-be5e-4803-9725-bb1565d2ba99-utilities\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.166246 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrfd2\" (UniqueName: \"kubernetes.io/projected/eec2d818-be5e-4803-9725-bb1565d2ba99-kube-api-access-xrfd2\") pod \"certified-operators-g84nj\" (UID: \"eec2d818-be5e-4803-9725-bb1565d2ba99\") " pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.209091 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.248744 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/5432e358-16b1-407d-9b0a-e07fa48e6a0d-kube-api-access-4jznh\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.248822 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-utilities\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.248937 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-catalog-content\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.352434 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-utilities\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.353029 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-catalog-content\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.353127 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/5432e358-16b1-407d-9b0a-e07fa48e6a0d-kube-api-access-4jznh\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.353177 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-utilities\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.353512 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5432e358-16b1-407d-9b0a-e07fa48e6a0d-catalog-content\") pod \"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.373035 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/5432e358-16b1-407d-9b0a-e07fa48e6a0d-kube-api-access-4jznh\") pod 
\"community-operators-r9jch\" (UID: \"5432e358-16b1-407d-9b0a-e07fa48e6a0d\") " pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.416108 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.666178 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g84nj"] Jan 26 00:13:39 crc kubenswrapper[4874]: W0126 00:13:39.668489 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec2d818_be5e_4803_9725_bb1565d2ba99.slice/crio-2f8b4753bfdbd23b4ad23eff01fa55e1688b430a182d0b5bb94135624ac8867a WatchSource:0}: Error finding container 2f8b4753bfdbd23b4ad23eff01fa55e1688b430a182d0b5bb94135624ac8867a: Status 404 returned error can't find the container with id 2f8b4753bfdbd23b4ad23eff01fa55e1688b430a182d0b5bb94135624ac8867a Jan 26 00:13:39 crc kubenswrapper[4874]: I0126 00:13:39.791503 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9jch"] Jan 26 00:13:39 crc kubenswrapper[4874]: W0126 00:13:39.794700 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5432e358_16b1_407d_9b0a_e07fa48e6a0d.slice/crio-c0dcf650a859587faf3bfc9906793b267685d3806af326ca37f2f96d5c8843f6 WatchSource:0}: Error finding container c0dcf650a859587faf3bfc9906793b267685d3806af326ca37f2f96d5c8843f6: Status 404 returned error can't find the container with id c0dcf650a859587faf3bfc9906793b267685d3806af326ca37f2f96d5c8843f6 Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.138469 4874 generic.go:334] "Generic (PLEG): container finished" podID="eec2d818-be5e-4803-9725-bb1565d2ba99" containerID="01cfb4aee76f74a21052a5f0efcf8b2ff42e53791b00c752b16c1842663ee555" exitCode=0 Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.138580 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g84nj" event={"ID":"eec2d818-be5e-4803-9725-bb1565d2ba99","Type":"ContainerDied","Data":"01cfb4aee76f74a21052a5f0efcf8b2ff42e53791b00c752b16c1842663ee555"} Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.138624 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g84nj" event={"ID":"eec2d818-be5e-4803-9725-bb1565d2ba99","Type":"ContainerStarted","Data":"2f8b4753bfdbd23b4ad23eff01fa55e1688b430a182d0b5bb94135624ac8867a"} Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.140314 4874 generic.go:334] "Generic (PLEG): container finished" podID="5432e358-16b1-407d-9b0a-e07fa48e6a0d" containerID="5085e803367cc16e1e15ceeb290c436f4abf5e421ac99575b67b34caac8a579f" exitCode=0 Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.140340 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9jch" event={"ID":"5432e358-16b1-407d-9b0a-e07fa48e6a0d","Type":"ContainerDied","Data":"5085e803367cc16e1e15ceeb290c436f4abf5e421ac99575b67b34caac8a579f"} Jan 26 00:13:40 crc kubenswrapper[4874]: I0126 00:13:40.140356 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9jch" event={"ID":"5432e358-16b1-407d-9b0a-e07fa48e6a0d","Type":"ContainerStarted","Data":"c0dcf650a859587faf3bfc9906793b267685d3806af326ca37f2f96d5c8843f6"} 
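The VerifyControllerAttachedVolume / MountVolume.SetUp entries above show each replacement catalog pod receiving two "kubernetes.io/empty-dir" volumes (utilities, catalog-content) and one "kubernetes.io/projected" service-account token volume (e.g. kube-api-access-xrfd2 for certified-operators-g84nj). As an aid to reading those entries, here is a minimal Go sketch of how such volumes are expressed with the k8s.io/api/core/v1 types; it is an illustrative reconstruction from the volume names and plugin names in the log, not the pod's actual manifest, and the projected volume's token sources (normally injected by the API server) are left empty.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Hypothetical reconstruction of the volume section of a marketplace
	// catalog pod such as certified-operators-g84nj, based only on the
	// volume names and plugin names ("kubernetes.io/empty-dir",
	// "kubernetes.io/projected") that appear in the kubelet log above.
	volumes := []corev1.Volume{
		{
			Name:         "utilities",
			VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}},
		},
		{
			Name:         "catalog-content",
			VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}},
		},
		{
			// The kube-api-access-* volume is a projected service-account
			// token volume; its concrete projection sources are filled in
			// by the API server and are omitted here.
			Name:         "kube-api-access-xrfd2",
			VolumeSource: corev1.VolumeSource{Projected: &corev1.ProjectedVolumeSource{}},
		},
	}

	for _, v := range volumes {
		fmt.Printf("volume %q: emptyDir=%v projected=%v\n",
			v.Name, v.EmptyDir != nil, v.Projected != nil)
	}
}

Compiling the sketch only requires a Go module that depends on k8s.io/api; it does nothing beyond printing which volume source each named volume uses.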
Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.292488 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.295889 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.298120 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.298512 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.483205 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.483259 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj5pj\" (UniqueName: \"kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.483302 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.483855 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbqx6"] Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.485140 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.487318 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.496050 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbqx6"] Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.584769 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-catalog-content\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585097 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585124 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-utilities\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585151 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj5pj\" (UniqueName: \"kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585184 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585388 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/4463e26e-6510-4fe5-8e96-c3035234bd88-kube-api-access-z8xpw\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585556 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.585676 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities\") pod \"redhat-marketplace-pmxgk\" (UID: 
\"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.604803 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj5pj\" (UniqueName: \"kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj\") pod \"redhat-marketplace-pmxgk\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.637589 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.686600 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-utilities\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.686691 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/4463e26e-6510-4fe5-8e96-c3035234bd88-kube-api-access-z8xpw\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.686744 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-catalog-content\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.687159 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-utilities\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.687186 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4463e26e-6510-4fe5-8e96-c3035234bd88-catalog-content\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.705426 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/4463e26e-6510-4fe5-8e96-c3035234bd88-kube-api-access-z8xpw\") pod \"redhat-operators-qbqx6\" (UID: \"4463e26e-6510-4fe5-8e96-c3035234bd88\") " pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:41 crc kubenswrapper[4874]: I0126 00:13:41.803199 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.031901 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:13:42 crc kubenswrapper[4874]: W0126 00:13:42.041083 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73e816b_0d86_414e_896d_7e4b2d0f79ac.slice/crio-c3062dea231c2fc1be13c4bb3d21328a3cd4327fe45afe7897a25923d21bc3a1 WatchSource:0}: Error finding container c3062dea231c2fc1be13c4bb3d21328a3cd4327fe45afe7897a25923d21bc3a1: Status 404 returned error can't find the container with id c3062dea231c2fc1be13c4bb3d21328a3cd4327fe45afe7897a25923d21bc3a1 Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.159102 4874 generic.go:334] "Generic (PLEG): container finished" podID="eec2d818-be5e-4803-9725-bb1565d2ba99" containerID="686c7516c000442381cbd3dfa6910680fee891cff6944def8e0df997915d7417" exitCode=0 Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.159153 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g84nj" event={"ID":"eec2d818-be5e-4803-9725-bb1565d2ba99","Type":"ContainerDied","Data":"686c7516c000442381cbd3dfa6910680fee891cff6944def8e0df997915d7417"} Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.166356 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerStarted","Data":"7522d7d07820ecb6cdcdfd541bf6823296a0ca1f714ba39c207ebe8857bf2755"} Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.166400 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerStarted","Data":"c3062dea231c2fc1be13c4bb3d21328a3cd4327fe45afe7897a25923d21bc3a1"} Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.168485 4874 generic.go:334] "Generic (PLEG): container finished" podID="5432e358-16b1-407d-9b0a-e07fa48e6a0d" containerID="d20ae72950f3b66ad23a3f85589dabfde04ed7c678862e4a35779e33eab68875" exitCode=0 Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.168534 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9jch" event={"ID":"5432e358-16b1-407d-9b0a-e07fa48e6a0d","Type":"ContainerDied","Data":"d20ae72950f3b66ad23a3f85589dabfde04ed7c678862e4a35779e33eab68875"} Jan 26 00:13:42 crc kubenswrapper[4874]: I0126 00:13:42.221907 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbqx6"] Jan 26 00:13:42 crc kubenswrapper[4874]: W0126 00:13:42.268060 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4463e26e_6510_4fe5_8e96_c3035234bd88.slice/crio-d6b14928e913ad756140363684ff92b30ae6c5127108d7391742cb9a34902d55 WatchSource:0}: Error finding container d6b14928e913ad756140363684ff92b30ae6c5127108d7391742cb9a34902d55: Status 404 returned error can't find the container with id d6b14928e913ad756140363684ff92b30ae6c5127108d7391742cb9a34902d55 Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.176678 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9jch" 
event={"ID":"5432e358-16b1-407d-9b0a-e07fa48e6a0d","Type":"ContainerStarted","Data":"473562d42eb72c9173a2a9994258d033b5dbd2de7ae0fbde77a81b8938d59a30"} Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.179317 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g84nj" event={"ID":"eec2d818-be5e-4803-9725-bb1565d2ba99","Type":"ContainerStarted","Data":"902f03a5f3d86867ee8596358fe5f7aa321c60ba5282195013cc00b8e960eeba"} Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.184501 4874 generic.go:334] "Generic (PLEG): container finished" podID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerID="7522d7d07820ecb6cdcdfd541bf6823296a0ca1f714ba39c207ebe8857bf2755" exitCode=0 Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.184562 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerDied","Data":"7522d7d07820ecb6cdcdfd541bf6823296a0ca1f714ba39c207ebe8857bf2755"} Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.185977 4874 generic.go:334] "Generic (PLEG): container finished" podID="4463e26e-6510-4fe5-8e96-c3035234bd88" containerID="9f3f94b57754d3a8b5a93d5582a9a61c4e8021b54041f47378030b5dcffcae3f" exitCode=0 Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.186014 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqx6" event={"ID":"4463e26e-6510-4fe5-8e96-c3035234bd88","Type":"ContainerDied","Data":"9f3f94b57754d3a8b5a93d5582a9a61c4e8021b54041f47378030b5dcffcae3f"} Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.186035 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqx6" event={"ID":"4463e26e-6510-4fe5-8e96-c3035234bd88","Type":"ContainerStarted","Data":"d6b14928e913ad756140363684ff92b30ae6c5127108d7391742cb9a34902d55"} Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.197398 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9jch" podStartSLOduration=1.49645719 podStartE2EDuration="4.197375229s" podCreationTimestamp="2026-01-26 00:13:39 +0000 UTC" firstStartedPulling="2026-01-26 00:13:40.141705592 +0000 UTC m=+394.875812159" lastFinishedPulling="2026-01-26 00:13:42.842623661 +0000 UTC m=+397.576730198" observedRunningTime="2026-01-26 00:13:43.19447177 +0000 UTC m=+397.928578317" watchObservedRunningTime="2026-01-26 00:13:43.197375229 +0000 UTC m=+397.931481776" Jan 26 00:13:43 crc kubenswrapper[4874]: I0126 00:13:43.231907 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g84nj" podStartSLOduration=2.66634085 podStartE2EDuration="5.231885856s" podCreationTimestamp="2026-01-26 00:13:38 +0000 UTC" firstStartedPulling="2026-01-26 00:13:40.142209256 +0000 UTC m=+394.876315823" lastFinishedPulling="2026-01-26 00:13:42.707754292 +0000 UTC m=+397.441860829" observedRunningTime="2026-01-26 00:13:43.229992604 +0000 UTC m=+397.964099151" watchObservedRunningTime="2026-01-26 00:13:43.231885856 +0000 UTC m=+397.965992393" Jan 26 00:13:45 crc kubenswrapper[4874]: I0126 00:13:45.204406 4874 generic.go:334] "Generic (PLEG): container finished" podID="4463e26e-6510-4fe5-8e96-c3035234bd88" containerID="6bdf8890cf1d1ae4b6437c4989634ddfebf1c4e0603931e4d2dcfb9369b6941c" exitCode=0 Jan 26 00:13:45 crc kubenswrapper[4874]: I0126 00:13:45.204476 4874 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqx6" event={"ID":"4463e26e-6510-4fe5-8e96-c3035234bd88","Type":"ContainerDied","Data":"6bdf8890cf1d1ae4b6437c4989634ddfebf1c4e0603931e4d2dcfb9369b6941c"} Jan 26 00:13:45 crc kubenswrapper[4874]: I0126 00:13:45.207440 4874 generic.go:334] "Generic (PLEG): container finished" podID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerID="d831012e2409bfeeefc7ef78161fd96be4166f18001de3535ae03f1ae52e93de" exitCode=0 Jan 26 00:13:45 crc kubenswrapper[4874]: I0126 00:13:45.207503 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerDied","Data":"d831012e2409bfeeefc7ef78161fd96be4166f18001de3535ae03f1ae52e93de"} Jan 26 00:13:48 crc kubenswrapper[4874]: I0126 00:13:48.105928 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-m5fbw" Jan 26 00:13:48 crc kubenswrapper[4874]: I0126 00:13:48.151551 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:13:48 crc kubenswrapper[4874]: I0126 00:13:48.241765 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerStarted","Data":"6c19d00bb255d512b501acea99a6fefd7c255ee6f02739bbc4fe5237d5dafbc7"} Jan 26 00:13:48 crc kubenswrapper[4874]: I0126 00:13:48.245264 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqx6" event={"ID":"4463e26e-6510-4fe5-8e96-c3035234bd88","Type":"ContainerStarted","Data":"a558827ea3f737588862eb252c71b7bedad596aadb37166628c613341fb92e41"} Jan 26 00:13:48 crc kubenswrapper[4874]: I0126 00:13:48.261680 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pmxgk" podStartSLOduration=4.543167831 podStartE2EDuration="7.261658191s" podCreationTimestamp="2026-01-26 00:13:41 +0000 UTC" firstStartedPulling="2026-01-26 00:13:43.18680634 +0000 UTC m=+397.920912887" lastFinishedPulling="2026-01-26 00:13:45.90529667 +0000 UTC m=+400.639403247" observedRunningTime="2026-01-26 00:13:48.259502662 +0000 UTC m=+402.993609209" watchObservedRunningTime="2026-01-26 00:13:48.261658191 +0000 UTC m=+402.995764728" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.209541 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.209608 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.253523 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.279265 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbqx6" podStartSLOduration=5.242506429 podStartE2EDuration="8.279237037s" podCreationTimestamp="2026-01-26 00:13:41 +0000 UTC" firstStartedPulling="2026-01-26 00:13:43.187219721 +0000 UTC m=+397.921326268" lastFinishedPulling="2026-01-26 00:13:46.223950319 +0000 UTC m=+400.958056876" observedRunningTime="2026-01-26 00:13:48.285853834 +0000 
UTC m=+403.019960371" watchObservedRunningTime="2026-01-26 00:13:49.279237037 +0000 UTC m=+404.013343574" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.301110 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g84nj" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.416563 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.416608 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:49 crc kubenswrapper[4874]: I0126 00:13:49.451546 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:50 crc kubenswrapper[4874]: I0126 00:13:50.302680 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9jch" Jan 26 00:13:51 crc kubenswrapper[4874]: I0126 00:13:51.638221 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:51 crc kubenswrapper[4874]: I0126 00:13:51.638680 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:51 crc kubenswrapper[4874]: I0126 00:13:51.684638 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:51 crc kubenswrapper[4874]: I0126 00:13:51.804109 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:51 crc kubenswrapper[4874]: I0126 00:13:51.804190 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:13:52 crc kubenswrapper[4874]: I0126 00:13:52.316465 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:13:52 crc kubenswrapper[4874]: I0126 00:13:52.443953 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:13:52 crc kubenswrapper[4874]: I0126 00:13:52.444035 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:13:52 crc kubenswrapper[4874]: I0126 00:13:52.841745 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qbqx6" podUID="4463e26e-6510-4fe5-8e96-c3035234bd88" containerName="registry-server" probeResult="failure" output=< Jan 26 00:13:52 crc kubenswrapper[4874]: timeout: failed to connect service ":50051" within 1s Jan 26 00:13:52 crc kubenswrapper[4874]: > Jan 26 00:14:01 crc kubenswrapper[4874]: I0126 00:14:01.865884 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:14:01 
crc kubenswrapper[4874]: I0126 00:14:01.921943 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbqx6" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.194824 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" podUID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" containerName="registry" containerID="cri-o://90099ad346900b5918821d2af4946f8ce68106e3491f3c75433cfd570a313e01" gracePeriod=30 Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.409153 4874 generic.go:334] "Generic (PLEG): container finished" podID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" containerID="90099ad346900b5918821d2af4946f8ce68106e3491f3c75433cfd570a313e01" exitCode=0 Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.409225 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" event={"ID":"5e4be981-c476-4a3d-a3a5-85e3a30b8376","Type":"ContainerDied","Data":"90099ad346900b5918821d2af4946f8ce68106e3491f3c75433cfd570a313e01"} Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.797914 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.984697 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.984799 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.984832 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.984893 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.984943 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.985227 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc 
kubenswrapper[4874]: I0126 00:14:13.985274 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksq99\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.985337 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca\") pod \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\" (UID: \"5e4be981-c476-4a3d-a3a5-85e3a30b8376\") " Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.986633 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.986776 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.992031 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99" (OuterVolumeSpecName: "kube-api-access-ksq99") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "kube-api-access-ksq99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.992642 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.993410 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:14:13 crc kubenswrapper[4874]: I0126 00:14:13.994782 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.002130 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.021012 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5e4be981-c476-4a3d-a3a5-85e3a30b8376" (UID: "5e4be981-c476-4a3d-a3a5-85e3a30b8376"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086822 4874 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086868 4874 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e4be981-c476-4a3d-a3a5-85e3a30b8376-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086883 4874 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e4be981-c476-4a3d-a3a5-85e3a30b8376-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086895 4874 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086909 4874 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086920 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksq99\" (UniqueName: \"kubernetes.io/projected/5e4be981-c476-4a3d-a3a5-85e3a30b8376-kube-api-access-ksq99\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.086932 4874 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e4be981-c476-4a3d-a3a5-85e3a30b8376-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.419077 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" event={"ID":"5e4be981-c476-4a3d-a3a5-85e3a30b8376","Type":"ContainerDied","Data":"8cd09f0357449a8c1b7e71e944a43afc04a539c47ce44f02dd7a0ccef6103f08"} Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.420265 4874 scope.go:117] "RemoveContainer" containerID="90099ad346900b5918821d2af4946f8ce68106e3491f3c75433cfd570a313e01" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.420406 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nmgqb" Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.469036 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:14:14 crc kubenswrapper[4874]: I0126 00:14:14.473695 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nmgqb"] Jan 26 00:14:15 crc kubenswrapper[4874]: I0126 00:14:15.627699 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" path="/var/lib/kubelet/pods/5e4be981-c476-4a3d-a3a5-85e3a30b8376/volumes" Jan 26 00:14:22 crc kubenswrapper[4874]: I0126 00:14:22.444514 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:14:22 crc kubenswrapper[4874]: I0126 00:14:22.445756 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:14:22 crc kubenswrapper[4874]: I0126 00:14:22.445916 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:14:22 crc kubenswrapper[4874]: I0126 00:14:22.446655 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:14:22 crc kubenswrapper[4874]: I0126 00:14:22.446829 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0" gracePeriod=600 Jan 26 00:14:23 crc kubenswrapper[4874]: I0126 00:14:23.481549 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0" exitCode=0 Jan 26 00:14:23 crc kubenswrapper[4874]: I0126 00:14:23.481664 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0"} Jan 26 00:14:23 crc kubenswrapper[4874]: I0126 00:14:23.481931 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab"} Jan 26 00:14:23 crc kubenswrapper[4874]: I0126 00:14:23.481984 4874 scope.go:117] "RemoveContainer" 
containerID="fee69f14be292017188afa5011c08356a7140e640b91718c3e0738db9f2b652b" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.195209 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth"] Jan 26 00:15:00 crc kubenswrapper[4874]: E0126 00:15:00.196137 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" containerName="registry" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.196160 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" containerName="registry" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.196358 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4be981-c476-4a3d-a3a5-85e3a30b8376" containerName="registry" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.197085 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.199444 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.202491 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.207264 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth"] Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.327485 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.327790 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.327816 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4f8f\" (UniqueName: \"kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.429340 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.429411 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.429446 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4f8f\" (UniqueName: \"kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.431561 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.440084 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.446151 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4f8f\" (UniqueName: \"kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f\") pod \"collect-profiles-29489775-z8fth\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.525906 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:00 crc kubenswrapper[4874]: I0126 00:15:00.935183 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth"] Jan 26 00:15:00 crc kubenswrapper[4874]: W0126 00:15:00.945488 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585fbc65_1085_4d86_a13b_9e88d44a9269.slice/crio-83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d WatchSource:0}: Error finding container 83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d: Status 404 returned error can't find the container with id 83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d Jan 26 00:15:01 crc kubenswrapper[4874]: I0126 00:15:01.126917 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" event={"ID":"585fbc65-1085-4d86-a13b-9e88d44a9269","Type":"ContainerStarted","Data":"4daead130eb26312cfcb130de509a02a23dc20383771dfe43f1bfe1124eca9c2"} Jan 26 00:15:01 crc kubenswrapper[4874]: I0126 00:15:01.129144 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" event={"ID":"585fbc65-1085-4d86-a13b-9e88d44a9269","Type":"ContainerStarted","Data":"83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d"} Jan 26 00:15:01 crc kubenswrapper[4874]: I0126 00:15:01.146605 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" podStartSLOduration=1.146582129 podStartE2EDuration="1.146582129s" podCreationTimestamp="2026-01-26 00:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:15:01.145325286 +0000 UTC m=+475.879431833" watchObservedRunningTime="2026-01-26 00:15:01.146582129 +0000 UTC m=+475.880688676" Jan 26 00:15:02 crc kubenswrapper[4874]: I0126 00:15:02.134884 4874 generic.go:334] "Generic (PLEG): container finished" podID="585fbc65-1085-4d86-a13b-9e88d44a9269" containerID="4daead130eb26312cfcb130de509a02a23dc20383771dfe43f1bfe1124eca9c2" exitCode=0 Jan 26 00:15:02 crc kubenswrapper[4874]: I0126 00:15:02.135060 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" event={"ID":"585fbc65-1085-4d86-a13b-9e88d44a9269","Type":"ContainerDied","Data":"4daead130eb26312cfcb130de509a02a23dc20383771dfe43f1bfe1124eca9c2"} Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.422969 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.475786 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume\") pod \"585fbc65-1085-4d86-a13b-9e88d44a9269\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.475847 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4f8f\" (UniqueName: \"kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f\") pod \"585fbc65-1085-4d86-a13b-9e88d44a9269\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.475940 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume\") pod \"585fbc65-1085-4d86-a13b-9e88d44a9269\" (UID: \"585fbc65-1085-4d86-a13b-9e88d44a9269\") " Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.477195 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume" (OuterVolumeSpecName: "config-volume") pod "585fbc65-1085-4d86-a13b-9e88d44a9269" (UID: "585fbc65-1085-4d86-a13b-9e88d44a9269"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.485568 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f" (OuterVolumeSpecName: "kube-api-access-s4f8f") pod "585fbc65-1085-4d86-a13b-9e88d44a9269" (UID: "585fbc65-1085-4d86-a13b-9e88d44a9269"). InnerVolumeSpecName "kube-api-access-s4f8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.485608 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "585fbc65-1085-4d86-a13b-9e88d44a9269" (UID: "585fbc65-1085-4d86-a13b-9e88d44a9269"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.577698 4874 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/585fbc65-1085-4d86-a13b-9e88d44a9269-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.577740 4874 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/585fbc65-1085-4d86-a13b-9e88d44a9269-config-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:15:03 crc kubenswrapper[4874]: I0126 00:15:03.577749 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4f8f\" (UniqueName: \"kubernetes.io/projected/585fbc65-1085-4d86-a13b-9e88d44a9269-kube-api-access-s4f8f\") on node \"crc\" DevicePath \"\"" Jan 26 00:15:04 crc kubenswrapper[4874]: I0126 00:15:04.149833 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" event={"ID":"585fbc65-1085-4d86-a13b-9e88d44a9269","Type":"ContainerDied","Data":"83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d"} Jan 26 00:15:04 crc kubenswrapper[4874]: I0126 00:15:04.150290 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a4a3c7315a7bd15dff7af128742352421421e3bd1b7c188a5bf9b0cd58368d" Jan 26 00:15:04 crc kubenswrapper[4874]: I0126 00:15:04.149920 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489775-z8fth" Jan 26 00:16:52 crc kubenswrapper[4874]: I0126 00:16:52.444442 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:16:52 crc kubenswrapper[4874]: I0126 00:16:52.445223 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:17:22 crc kubenswrapper[4874]: I0126 00:17:22.444331 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:17:22 crc kubenswrapper[4874]: I0126 00:17:22.445268 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:17:52 crc kubenswrapper[4874]: I0126 00:17:52.444822 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:17:52 crc kubenswrapper[4874]: I0126 00:17:52.445821 
4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:17:52 crc kubenswrapper[4874]: I0126 00:17:52.445905 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:17:52 crc kubenswrapper[4874]: I0126 00:17:52.446795 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:17:52 crc kubenswrapper[4874]: I0126 00:17:52.446852 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab" gracePeriod=600 Jan 26 00:17:53 crc kubenswrapper[4874]: I0126 00:17:53.311935 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab" exitCode=0 Jan 26 00:17:53 crc kubenswrapper[4874]: I0126 00:17:53.312022 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab"} Jan 26 00:17:53 crc kubenswrapper[4874]: I0126 00:17:53.312569 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb"} Jan 26 00:17:53 crc kubenswrapper[4874]: I0126 00:17:53.312596 4874 scope.go:117] "RemoveContainer" containerID="e5c45472b82d54943922a48ae40bec07ff1c9ab6f870bd221c29915f338518d0" Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.784036 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dvmzv"] Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.785520 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-controller" containerID="cri-o://66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.786136 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="sbdb" containerID="cri-o://000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.786215 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" 
podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="nbdb" containerID="cri-o://8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.786276 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="northd" containerID="cri-o://830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.787495 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.787648 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-node" containerID="cri-o://6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.787760 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-acl-logging" containerID="cri-o://3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" gracePeriod=30 Jan 26 00:18:55 crc kubenswrapper[4874]: I0126 00:18:55.830294 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" containerID="cri-o://8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" gracePeriod=30 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.210513 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/3.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.214742 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovn-acl-logging/0.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.215439 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovn-controller/0.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.216006 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280535 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqbpb"] Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280788 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="nbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280803 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="nbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280816 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="sbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280821 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="sbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280831 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280837 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280843 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585fbc65-1085-4d86-a13b-9e88d44a9269" containerName="collect-profiles" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280850 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="585fbc65-1085-4d86-a13b-9e88d44a9269" containerName="collect-profiles" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280861 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280867 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280875 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280881 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280887 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-acl-logging" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280894 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-acl-logging" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280902 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kubecfg-setup" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280908 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kubecfg-setup" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280916 4874 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-ovn-metrics" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280922 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-ovn-metrics" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280930 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280935 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280943 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="northd" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280949 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="northd" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.280976 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-node" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.280982 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-node" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281091 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281101 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281108 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281114 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="sbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281119 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="585fbc65-1085-4d86-a13b-9e88d44a9269" containerName="collect-profiles" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281126 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-ovn-metrics" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281133 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="northd" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281141 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovn-acl-logging" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281150 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="kube-rbac-proxy-node" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281157 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="nbdb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 
00:18:56.281165 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281171 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.281250 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281257 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.281267 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281272 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.281359 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a8f534-7005-4a14-ae01-0a689a541502" containerName="ovnkube-controller" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.282775 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318592 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318644 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318676 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318700 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318730 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318769 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318739 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318881 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318951 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318885 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket" (OuterVolumeSpecName: "log-socket") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318907 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash" (OuterVolumeSpecName: "host-slash") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318921 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.318937 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319016 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319274 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319757 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319805 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319870 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319884 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log" (OuterVolumeSpecName: "node-log") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319901 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319927 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319958 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.319997 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320008 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320069 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320077 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320118 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320151 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320181 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320184 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320215 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320232 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320430 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch\") pod \"79a8f534-7005-4a14-ae01-0a689a541502\" (UID: \"79a8f534-7005-4a14-ae01-0a689a541502\") " Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320494 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320533 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320717 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320771 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-netns\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320786 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320810 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-bin\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320837 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-node-log\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320874 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-ovn\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320915 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.320947 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-env-overrides\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321020 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-systemd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321047 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-etc-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321081 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321117 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-systemd-units\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321187 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-script-lib\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321270 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-kubelet\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321322 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-config\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321366 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-netd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321407 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321448 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-slash\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321480 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-var-lib-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321505 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-log-socket\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321525 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjnc\" (UniqueName: \"kubernetes.io/projected/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-kube-api-access-vtjnc\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321684 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321804 4874 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321829 4874 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321841 4874 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-slash\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321853 4874 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321868 4874 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321881 4874 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-log-socket\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321909 4874 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321921 4874 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321933 4874 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-node-log\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321942 4874 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321951 4874 reconciler_common.go:293] 
"Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321979 4874 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.321992 4874 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.322002 4874 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.322011 4874 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.322020 4874 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a8f534-7005-4a14-ae01-0a689a541502-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.322031 4874 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.327620 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl" (OuterVolumeSpecName: "kube-api-access-52hgl") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "kube-api-access-52hgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.327855 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.343430 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "79a8f534-7005-4a14-ae01-0a689a541502" (UID: "79a8f534-7005-4a14-ae01-0a689a541502"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423244 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-systemd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423325 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-etc-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423357 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423393 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-systemd-units\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423419 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-script-lib\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423448 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-kubelet\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423459 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-systemd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423482 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-config\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423545 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-systemd-units\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 
crc kubenswrapper[4874]: I0126 00:18:56.423544 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423587 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-kubelet\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423657 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-netd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423676 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-etc-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423616 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-netd\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423877 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423925 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-slash\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423950 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-var-lib-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424037 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-slash\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424116 4874 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-var-lib-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.423994 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-log-socket\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424191 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjnc\" (UniqueName: \"kubernetes.io/projected/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-kube-api-access-vtjnc\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424234 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-log-socket\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424260 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424374 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-openvswitch\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424324 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-netns\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424427 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-bin\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424474 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-netns\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424514 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-node-log\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424578 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-ovn\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424597 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-cni-bin\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424614 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-config\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424650 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-run-ovn\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424662 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-node-log\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424746 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424676 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424812 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovnkube-script-lib\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424820 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-env-overrides\") pod \"ovnkube-node-lqbpb\" (UID: 
\"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424933 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52hgl\" (UniqueName: \"kubernetes.io/projected/79a8f534-7005-4a14-ae01-0a689a541502-kube-api-access-52hgl\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.424990 4874 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a8f534-7005-4a14-ae01-0a689a541502-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.425009 4874 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a8f534-7005-4a14-ae01-0a689a541502-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.425693 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-env-overrides\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.428633 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.455589 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjnc\" (UniqueName: \"kubernetes.io/projected/83cc4b9c-e19d-4f20-928f-29a1d83e3b5f-kube-api-access-vtjnc\") pod \"ovnkube-node-lqbpb\" (UID: \"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.599741 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.762350 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/2.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.763145 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/1.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.763214 4874 generic.go:334] "Generic (PLEG): container finished" podID="23d0d511-4c29-434d-92b1-74ca6909e53f" containerID="8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f" exitCode=2 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.763268 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerDied","Data":"8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.763825 4874 scope.go:117] "RemoveContainer" containerID="5ba82c584f08b6c8ce34755705ae1b63204b635a98504786c3209b2f2b9a3588" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.764448 4874 scope.go:117] "RemoveContainer" containerID="8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f" Jan 26 00:18:56 crc kubenswrapper[4874]: E0126 00:18:56.764805 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qjn7c_openshift-multus(23d0d511-4c29-434d-92b1-74ca6909e53f)\"" pod="openshift-multus/multus-qjn7c" podUID="23d0d511-4c29-434d-92b1-74ca6909e53f" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.776604 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovnkube-controller/3.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.780123 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovn-acl-logging/0.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.781571 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dvmzv_79a8f534-7005-4a14-ae01-0a689a541502/ovn-controller/0.log" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782546 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" exitCode=0 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782582 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" exitCode=0 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782594 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" exitCode=0 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782611 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" exitCode=0 Jan 26 
00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782621 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" exitCode=0 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782630 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" exitCode=0 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782640 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" exitCode=143 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782649 4874 generic.go:334] "Generic (PLEG): container finished" podID="79a8f534-7005-4a14-ae01-0a689a541502" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" exitCode=143 Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782650 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782709 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782724 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782737 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782749 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782761 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782777 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782791 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782797 4874 pod_container_deletor.go:114] "Failed to issue the request to 
remove container" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782803 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.783696 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.783730 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.782706 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784686 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784704 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784710 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784720 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784731 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784753 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784760 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784765 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784770 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784776 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784781 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784786 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784792 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784798 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784805 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784812 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784821 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784829 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784834 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784839 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784844 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784850 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784856 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784862 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784867 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784872 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784879 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dvmzv" event={"ID":"79a8f534-7005-4a14-ae01-0a689a541502","Type":"ContainerDied","Data":"553c8860dba1935ef1d302365108e2e24a0bac9817de49dd9df4618177b49703"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784888 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784918 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784924 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784930 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784935 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784941 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784945 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784951 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784979 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.784985 4874 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.797926 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"dff7c63b29df4df0f0f6f09dbec18d721aeccd21e6997c54852fb64227f3098c"} Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.829625 4874 scope.go:117] "RemoveContainer" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.848591 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dvmzv"] Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.853486 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dvmzv"] Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.854639 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.879046 4874 scope.go:117] "RemoveContainer" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.899125 4874 scope.go:117] "RemoveContainer" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.922086 4874 scope.go:117] "RemoveContainer" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.942635 4874 scope.go:117] "RemoveContainer" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.964675 4874 scope.go:117] "RemoveContainer" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.983336 4874 scope.go:117] "RemoveContainer" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:56 crc kubenswrapper[4874]: I0126 00:18:56.998699 4874 scope.go:117] "RemoveContainer" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.019788 4874 scope.go:117] "RemoveContainer" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.037380 4874 scope.go:117] "RemoveContainer" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.038007 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": container with ID starting with 8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795 not found: ID does not exist" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.038064 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} err="failed to get container status \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": rpc error: code = NotFound desc = could not find container \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": container with ID starting with 8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795 not found: ID 
does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.038102 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.038928 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": container with ID starting with 6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738 not found: ID does not exist" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.039089 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} err="failed to get container status \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": rpc error: code = NotFound desc = could not find container \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": container with ID starting with 6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.039141 4874 scope.go:117] "RemoveContainer" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.039554 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": container with ID starting with 000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a not found: ID does not exist" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.039591 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} err="failed to get container status \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": rpc error: code = NotFound desc = could not find container \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": container with ID starting with 000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.039613 4874 scope.go:117] "RemoveContainer" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.040008 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": container with ID starting with 8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc not found: ID does not exist" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.040052 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} err="failed to get container status \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": rpc error: code = NotFound desc = could not find container 
\"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": container with ID starting with 8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.040078 4874 scope.go:117] "RemoveContainer" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.040436 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": container with ID starting with 830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d not found: ID does not exist" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.040470 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} err="failed to get container status \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": rpc error: code = NotFound desc = could not find container \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": container with ID starting with 830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.040488 4874 scope.go:117] "RemoveContainer" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.040989 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": container with ID starting with 26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab not found: ID does not exist" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.041095 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} err="failed to get container status \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": rpc error: code = NotFound desc = could not find container \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": container with ID starting with 26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.041135 4874 scope.go:117] "RemoveContainer" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.041505 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": container with ID starting with 6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6 not found: ID does not exist" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.041534 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} 
err="failed to get container status \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": rpc error: code = NotFound desc = could not find container \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": container with ID starting with 6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.041551 4874 scope.go:117] "RemoveContainer" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.042044 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": container with ID starting with 3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976 not found: ID does not exist" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.042076 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} err="failed to get container status \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": rpc error: code = NotFound desc = could not find container \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": container with ID starting with 3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.042095 4874 scope.go:117] "RemoveContainer" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.042487 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": container with ID starting with 66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a not found: ID does not exist" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.042537 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} err="failed to get container status \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": rpc error: code = NotFound desc = could not find container \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": container with ID starting with 66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.042570 4874 scope.go:117] "RemoveContainer" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: E0126 00:18:57.042939 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": container with ID starting with a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c not found: ID does not exist" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.042993 4874 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} err="failed to get container status \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": rpc error: code = NotFound desc = could not find container \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": container with ID starting with a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.043016 4874 scope.go:117] "RemoveContainer" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.043480 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} err="failed to get container status \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": rpc error: code = NotFound desc = could not find container \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": container with ID starting with 8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.043510 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.043879 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} err="failed to get container status \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": rpc error: code = NotFound desc = could not find container \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": container with ID starting with 6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.044032 4874 scope.go:117] "RemoveContainer" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.044553 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} err="failed to get container status \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": rpc error: code = NotFound desc = could not find container \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": container with ID starting with 000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.044584 4874 scope.go:117] "RemoveContainer" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.044993 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} err="failed to get container status \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": rpc error: code = NotFound desc = could not find container \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": container with ID starting with 
8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.045080 4874 scope.go:117] "RemoveContainer" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.045717 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} err="failed to get container status \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": rpc error: code = NotFound desc = could not find container \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": container with ID starting with 830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.045747 4874 scope.go:117] "RemoveContainer" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.046130 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} err="failed to get container status \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": rpc error: code = NotFound desc = could not find container \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": container with ID starting with 26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.046166 4874 scope.go:117] "RemoveContainer" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.046607 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} err="failed to get container status \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": rpc error: code = NotFound desc = could not find container \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": container with ID starting with 6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.046638 4874 scope.go:117] "RemoveContainer" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.047237 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} err="failed to get container status \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": rpc error: code = NotFound desc = could not find container \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": container with ID starting with 3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.047278 4874 scope.go:117] "RemoveContainer" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.047802 4874 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} err="failed to get container status \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": rpc error: code = NotFound desc = could not find container \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": container with ID starting with 66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.047841 4874 scope.go:117] "RemoveContainer" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.048243 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} err="failed to get container status \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": rpc error: code = NotFound desc = could not find container \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": container with ID starting with a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.048276 4874 scope.go:117] "RemoveContainer" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.048709 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} err="failed to get container status \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": rpc error: code = NotFound desc = could not find container \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": container with ID starting with 8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.048735 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.049203 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} err="failed to get container status \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": rpc error: code = NotFound desc = could not find container \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": container with ID starting with 6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.049258 4874 scope.go:117] "RemoveContainer" containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.049571 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} err="failed to get container status \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": rpc error: code = NotFound desc = could not find container \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": container with ID starting with 000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a not found: ID does not exist" Jan 
26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.049597 4874 scope.go:117] "RemoveContainer" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050106 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} err="failed to get container status \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": rpc error: code = NotFound desc = could not find container \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": container with ID starting with 8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050158 4874 scope.go:117] "RemoveContainer" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050520 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} err="failed to get container status \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": rpc error: code = NotFound desc = could not find container \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": container with ID starting with 830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050554 4874 scope.go:117] "RemoveContainer" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050903 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} err="failed to get container status \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": rpc error: code = NotFound desc = could not find container \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": container with ID starting with 26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.050925 4874 scope.go:117] "RemoveContainer" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.051439 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} err="failed to get container status \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": rpc error: code = NotFound desc = could not find container \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": container with ID starting with 6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.051516 4874 scope.go:117] "RemoveContainer" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.051874 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} err="failed to get container status 
\"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": rpc error: code = NotFound desc = could not find container \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": container with ID starting with 3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.051905 4874 scope.go:117] "RemoveContainer" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.052388 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} err="failed to get container status \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": rpc error: code = NotFound desc = could not find container \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": container with ID starting with 66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.052443 4874 scope.go:117] "RemoveContainer" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.052907 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} err="failed to get container status \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": rpc error: code = NotFound desc = could not find container \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": container with ID starting with a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.052991 4874 scope.go:117] "RemoveContainer" containerID="8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.053449 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795"} err="failed to get container status \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": rpc error: code = NotFound desc = could not find container \"8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795\": container with ID starting with 8afa3010ba16ecd0c7c26d07797e59529fdeb25e454bc79f7508fb2b9cbb5795 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.053480 4874 scope.go:117] "RemoveContainer" containerID="6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.053996 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738"} err="failed to get container status \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": rpc error: code = NotFound desc = could not find container \"6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738\": container with ID starting with 6beea11c7605a0f7eb594ef9c22c0a82fb1ceb118273b7fea540d128309d1738 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.054030 4874 scope.go:117] "RemoveContainer" 
containerID="000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.054444 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a"} err="failed to get container status \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": rpc error: code = NotFound desc = could not find container \"000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a\": container with ID starting with 000e36c3e49b4e3f64070101715fa99fface0833f9eb4028df1f0c6ae9c5d23a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.054470 4874 scope.go:117] "RemoveContainer" containerID="8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.054930 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc"} err="failed to get container status \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": rpc error: code = NotFound desc = could not find container \"8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc\": container with ID starting with 8e434a7736bf9eb6cf07e96626dc66ac8413cd0d6f63c690197c7574e4572dfc not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.054990 4874 scope.go:117] "RemoveContainer" containerID="830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.055325 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d"} err="failed to get container status \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": rpc error: code = NotFound desc = could not find container \"830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d\": container with ID starting with 830cc3f077c10f526e4a01bb6df0c5bfd298527cec0e3bd396ee79411e5d859d not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.055353 4874 scope.go:117] "RemoveContainer" containerID="26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.055758 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab"} err="failed to get container status \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": rpc error: code = NotFound desc = could not find container \"26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab\": container with ID starting with 26297ff5dd91a7d9653fa42a0d791db3270f5f63b6af932b1071a28ad0ee69ab not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.055786 4874 scope.go:117] "RemoveContainer" containerID="6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.056237 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6"} err="failed to get container status \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": rpc error: code = NotFound desc = could not find 
container \"6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6\": container with ID starting with 6fd8b29b3800140b75f9f4cd290480cc0250a8b31aa1ccb6403a24eef8d48ea6 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.056263 4874 scope.go:117] "RemoveContainer" containerID="3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.056619 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976"} err="failed to get container status \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": rpc error: code = NotFound desc = could not find container \"3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976\": container with ID starting with 3664b6583195c930430a41ffd65e7e4ebc78e76a1d6079a290cd69040228f976 not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.056650 4874 scope.go:117] "RemoveContainer" containerID="66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.056989 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a"} err="failed to get container status \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": rpc error: code = NotFound desc = could not find container \"66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a\": container with ID starting with 66b7612f4fd009865d471008d3d45046f20db72db57130a23683d10d6ba9c85a not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.057017 4874 scope.go:117] "RemoveContainer" containerID="a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.057358 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c"} err="failed to get container status \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": rpc error: code = NotFound desc = could not find container \"a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c\": container with ID starting with a3fcfae53d4a558e4bcee8644c8145c256b8cafb59cd2d4773a4b43d4f217e4c not found: ID does not exist" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.630771 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a8f534-7005-4a14-ae01-0a689a541502" path="/var/lib/kubelet/pods/79a8f534-7005-4a14-ae01-0a689a541502/volumes" Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.809680 4874 generic.go:334] "Generic (PLEG): container finished" podID="83cc4b9c-e19d-4f20-928f-29a1d83e3b5f" containerID="78cdb45387be4736402cf807c6811e68237ca10530c7dd590ba6304ab422bc40" exitCode=0 Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.809766 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerDied","Data":"78cdb45387be4736402cf807c6811e68237ca10530c7dd590ba6304ab422bc40"} Jan 26 00:18:57 crc kubenswrapper[4874]: I0126 00:18:57.813505 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/2.log" Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826113 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"89ba5202e58560721ab3c87a528de3c1fb9fbc011eae884f3c34eebab58b7719"} Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826512 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"ff6ac71a7ab25ec3b2eff254ffe0de433bdd9d89356b571574cfe4e7e4be392f"} Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826536 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"80284115b0ea9a7295c573a8a7f8e6d52355e3403821348576e0e07012ad8bb0"} Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826555 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"53272d5cd5e24ee9d699017d97cdc3a1e8a924ea10e0d0d6b1e592a3dd11ef14"} Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826573 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"7b6067f86551b270dad3d9367ea350a0ee118d633981f8f6250d33b00be323ab"} Jan 26 00:18:58 crc kubenswrapper[4874]: I0126 00:18:58.826591 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"57f12c1fd4981b3bfa809b3608a178feb9de791a1075cd776cb6e6dead2baf4c"} Jan 26 00:19:01 crc kubenswrapper[4874]: I0126 00:19:01.857936 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"1b3c83d28ab9354da7b709dd9be18530a763768af6819769f16f1396ba76762d"} Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.877705 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" event={"ID":"83cc4b9c-e19d-4f20-928f-29a1d83e3b5f","Type":"ContainerStarted","Data":"87f9c8a5e9ac36e5ea41e9b1047525920a39588a2a49f2b9bb19c506bb151da6"} Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.878363 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.878387 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.878403 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.904546 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.913051 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" podStartSLOduration=7.913031124 podStartE2EDuration="7.913031124s" podCreationTimestamp="2026-01-26 00:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:19:03.910619486 +0000 UTC m=+718.644726043" watchObservedRunningTime="2026-01-26 00:19:03.913031124 +0000 UTC m=+718.647137671" Jan 26 00:19:03 crc kubenswrapper[4874]: I0126 00:19:03.919028 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:07 crc kubenswrapper[4874]: I0126 00:19:07.620788 4874 scope.go:117] "RemoveContainer" containerID="8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f" Jan 26 00:19:07 crc kubenswrapper[4874]: E0126 00:19:07.622764 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qjn7c_openshift-multus(23d0d511-4c29-434d-92b1-74ca6909e53f)\"" pod="openshift-multus/multus-qjn7c" podUID="23d0d511-4c29-434d-92b1-74ca6909e53f" Jan 26 00:19:21 crc kubenswrapper[4874]: I0126 00:19:21.621032 4874 scope.go:117] "RemoveContainer" containerID="8c5547c0864eeb4449f5d18919847bde87431e8a6f0a19ace3ce338c4d26312f" Jan 26 00:19:21 crc kubenswrapper[4874]: I0126 00:19:21.997475 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjn7c_23d0d511-4c29-434d-92b1-74ca6909e53f/kube-multus/2.log" Jan 26 00:19:21 crc kubenswrapper[4874]: I0126 00:19:21.997815 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjn7c" event={"ID":"23d0d511-4c29-434d-92b1-74ca6909e53f","Type":"ContainerStarted","Data":"fee66b9036992fbe8472ff19743939b25c3a8a6a851d24d116f5d0f1e7cc2bd4"} Jan 26 00:19:26 crc kubenswrapper[4874]: I0126 00:19:26.638264 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqbpb" Jan 26 00:19:47 crc kubenswrapper[4874]: I0126 00:19:47.156847 4874 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 26 00:19:52 crc kubenswrapper[4874]: I0126 00:19:52.444485 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:19:52 crc kubenswrapper[4874]: I0126 00:19:52.446611 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.149314 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.150398 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pmxgk" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="registry-server" 
containerID="cri-o://6c19d00bb255d512b501acea99a6fefd7c255ee6f02739bbc4fe5237d5dafbc7" gracePeriod=30 Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.448930 4874 generic.go:334] "Generic (PLEG): container finished" podID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerID="6c19d00bb255d512b501acea99a6fefd7c255ee6f02739bbc4fe5237d5dafbc7" exitCode=0 Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.448988 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerDied","Data":"6c19d00bb255d512b501acea99a6fefd7c255ee6f02739bbc4fe5237d5dafbc7"} Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.530112 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.634298 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities\") pod \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.634408 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj5pj\" (UniqueName: \"kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj\") pod \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.634433 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content\") pod \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\" (UID: \"c73e816b-0d86-414e-896d-7e4b2d0f79ac\") " Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.636295 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities" (OuterVolumeSpecName: "utilities") pod "c73e816b-0d86-414e-896d-7e4b2d0f79ac" (UID: "c73e816b-0d86-414e-896d-7e4b2d0f79ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.640672 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj" (OuterVolumeSpecName: "kube-api-access-mj5pj") pod "c73e816b-0d86-414e-896d-7e4b2d0f79ac" (UID: "c73e816b-0d86-414e-896d-7e4b2d0f79ac"). InnerVolumeSpecName "kube-api-access-mj5pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.660152 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c73e816b-0d86-414e-896d-7e4b2d0f79ac" (UID: "c73e816b-0d86-414e-896d-7e4b2d0f79ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.736565 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj5pj\" (UniqueName: \"kubernetes.io/projected/c73e816b-0d86-414e-896d-7e4b2d0f79ac-kube-api-access-mj5pj\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.736619 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:01 crc kubenswrapper[4874]: I0126 00:20:01.736640 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c73e816b-0d86-414e-896d-7e4b2d0f79ac-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.458503 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmxgk" event={"ID":"c73e816b-0d86-414e-896d-7e4b2d0f79ac","Type":"ContainerDied","Data":"c3062dea231c2fc1be13c4bb3d21328a3cd4327fe45afe7897a25923d21bc3a1"} Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.458580 4874 scope.go:117] "RemoveContainer" containerID="6c19d00bb255d512b501acea99a6fefd7c255ee6f02739bbc4fe5237d5dafbc7" Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.458629 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmxgk" Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.476159 4874 scope.go:117] "RemoveContainer" containerID="d831012e2409bfeeefc7ef78161fd96be4166f18001de3535ae03f1ae52e93de" Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.492258 4874 scope.go:117] "RemoveContainer" containerID="7522d7d07820ecb6cdcdfd541bf6823296a0ca1f714ba39c207ebe8857bf2755" Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.499463 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:20:02 crc kubenswrapper[4874]: I0126 00:20:02.504108 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmxgk"] Jan 26 00:20:03 crc kubenswrapper[4874]: I0126 00:20:03.628270 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" path="/var/lib/kubelet/pods/c73e816b-0d86-414e-896d-7e4b2d0f79ac/volumes" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.985509 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg"] Jan 26 00:20:04 crc kubenswrapper[4874]: E0126 00:20:04.986101 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="extract-content" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.986125 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="extract-content" Jan 26 00:20:04 crc kubenswrapper[4874]: E0126 00:20:04.986157 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="extract-utilities" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.986170 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="extract-utilities" Jan 26 00:20:04 crc 
kubenswrapper[4874]: E0126 00:20:04.986187 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="registry-server" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.986198 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="registry-server" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.986379 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73e816b-0d86-414e-896d-7e4b2d0f79ac" containerName="registry-server" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.987874 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:04 crc kubenswrapper[4874]: I0126 00:20:04.990832 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.016455 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg"] Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.080735 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr6sb\" (UniqueName: \"kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.080788 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.081167 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.182663 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.182799 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr6sb\" (UniqueName: \"kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.182853 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.183597 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.184042 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.217234 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr6sb\" (UniqueName: \"kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.313345 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:05 crc kubenswrapper[4874]: I0126 00:20:05.491025 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg"] Jan 26 00:20:05 crc kubenswrapper[4874]: E0126 00:20:05.992356 4874 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode605a81e_2edf_416f_a26d_3967c0d4e15e.slice/crio-9d7475cafc0f0e07b695461a331ca861777877d1d5f36a6dcb85ffc3dd3457df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode605a81e_2edf_416f_a26d_3967c0d4e15e.slice/crio-conmon-9d7475cafc0f0e07b695461a331ca861777877d1d5f36a6dcb85ffc3dd3457df.scope\": RecentStats: unable to find data in memory cache]" Jan 26 00:20:06 crc kubenswrapper[4874]: I0126 00:20:06.487722 4874 generic.go:334] "Generic (PLEG): container finished" podID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerID="9d7475cafc0f0e07b695461a331ca861777877d1d5f36a6dcb85ffc3dd3457df" exitCode=0 Jan 26 00:20:06 crc kubenswrapper[4874]: I0126 00:20:06.488106 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" event={"ID":"e605a81e-2edf-416f-a26d-3967c0d4e15e","Type":"ContainerDied","Data":"9d7475cafc0f0e07b695461a331ca861777877d1d5f36a6dcb85ffc3dd3457df"} Jan 26 00:20:06 crc kubenswrapper[4874]: I0126 00:20:06.488192 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" event={"ID":"e605a81e-2edf-416f-a26d-3967c0d4e15e","Type":"ContainerStarted","Data":"1839b83f8fc9ec73838a4a87c58bb4a97c4080c87be092673c6545a764da019a"} Jan 26 00:20:06 crc kubenswrapper[4874]: I0126 00:20:06.491425 4874 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.726920 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.730814 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.736542 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.821493 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.821550 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.821608 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cns5b\" (UniqueName: \"kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.923223 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.923679 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cns5b\" (UniqueName: \"kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.923950 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.924272 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.924948 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:07 crc kubenswrapper[4874]: I0126 00:20:07.955238 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cns5b\" (UniqueName: \"kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b\") pod \"redhat-operators-qb9r8\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.059354 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.255216 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.501119 4874 generic.go:334] "Generic (PLEG): container finished" podID="39304a23-bff0-4736-887e-766831f728a3" containerID="44cec63b3d97762dd3f1e3ac80dc38db6b0e3f82ee15d889ae324a64c18d1fbd" exitCode=0 Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.501192 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerDied","Data":"44cec63b3d97762dd3f1e3ac80dc38db6b0e3f82ee15d889ae324a64c18d1fbd"} Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.501221 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerStarted","Data":"265d5bc206ef053698657e1b6276f28e894dc4d74e235154c942e7f6f082c419"} Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.504512 4874 generic.go:334] "Generic (PLEG): container finished" podID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerID="87175607bbbb2c3664d7041ee0e2a82d58478eea0bc9ac1a8df732115f5d7930" exitCode=0 Jan 26 00:20:08 crc kubenswrapper[4874]: I0126 00:20:08.504560 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" event={"ID":"e605a81e-2edf-416f-a26d-3967c0d4e15e","Type":"ContainerDied","Data":"87175607bbbb2c3664d7041ee0e2a82d58478eea0bc9ac1a8df732115f5d7930"} Jan 26 00:20:09 crc kubenswrapper[4874]: I0126 00:20:09.513319 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerStarted","Data":"21fee66d62e288d372731064695fbbb52964e8e240c526e4ddbe5f638ba637af"} Jan 26 00:20:09 crc kubenswrapper[4874]: I0126 00:20:09.518041 4874 generic.go:334] "Generic (PLEG): container finished" podID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerID="a68d41f3717fe680ec2b075e7e7c99afa620e55d0e6ddedfde2486223055a4ad" exitCode=0 Jan 26 00:20:09 crc kubenswrapper[4874]: I0126 00:20:09.518097 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" event={"ID":"e605a81e-2edf-416f-a26d-3967c0d4e15e","Type":"ContainerDied","Data":"a68d41f3717fe680ec2b075e7e7c99afa620e55d0e6ddedfde2486223055a4ad"} Jan 26 00:20:10 crc kubenswrapper[4874]: I0126 00:20:10.529463 4874 generic.go:334] "Generic (PLEG): container finished" podID="39304a23-bff0-4736-887e-766831f728a3" containerID="21fee66d62e288d372731064695fbbb52964e8e240c526e4ddbe5f638ba637af" exitCode=0 Jan 26 00:20:10 crc kubenswrapper[4874]: I0126 00:20:10.529554 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" 
event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerDied","Data":"21fee66d62e288d372731064695fbbb52964e8e240c526e4ddbe5f638ba637af"} Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.344316 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.475491 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle\") pod \"e605a81e-2edf-416f-a26d-3967c0d4e15e\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.475549 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr6sb\" (UniqueName: \"kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb\") pod \"e605a81e-2edf-416f-a26d-3967c0d4e15e\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.475621 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util\") pod \"e605a81e-2edf-416f-a26d-3967c0d4e15e\" (UID: \"e605a81e-2edf-416f-a26d-3967c0d4e15e\") " Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.479818 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle" (OuterVolumeSpecName: "bundle") pod "e605a81e-2edf-416f-a26d-3967c0d4e15e" (UID: "e605a81e-2edf-416f-a26d-3967c0d4e15e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.487848 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util" (OuterVolumeSpecName: "util") pod "e605a81e-2edf-416f-a26d-3967c0d4e15e" (UID: "e605a81e-2edf-416f-a26d-3967c0d4e15e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.489372 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb" (OuterVolumeSpecName: "kube-api-access-kr6sb") pod "e605a81e-2edf-416f-a26d-3967c0d4e15e" (UID: "e605a81e-2edf-416f-a26d-3967c0d4e15e"). InnerVolumeSpecName "kube-api-access-kr6sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.541029 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" event={"ID":"e605a81e-2edf-416f-a26d-3967c0d4e15e","Type":"ContainerDied","Data":"1839b83f8fc9ec73838a4a87c58bb4a97c4080c87be092673c6545a764da019a"} Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.541559 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1839b83f8fc9ec73838a4a87c58bb4a97c4080c87be092673c6545a764da019a" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.541165 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.545311 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerStarted","Data":"46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c"} Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.566732 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qb9r8" podStartSLOduration=1.913370548 podStartE2EDuration="4.566692186s" podCreationTimestamp="2026-01-26 00:20:07 +0000 UTC" firstStartedPulling="2026-01-26 00:20:08.504728918 +0000 UTC m=+783.238835455" lastFinishedPulling="2026-01-26 00:20:11.158050546 +0000 UTC m=+785.892157093" observedRunningTime="2026-01-26 00:20:11.565692057 +0000 UTC m=+786.299798594" watchObservedRunningTime="2026-01-26 00:20:11.566692186 +0000 UTC m=+786.300798723" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.577385 4874 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.577419 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr6sb\" (UniqueName: \"kubernetes.io/projected/e605a81e-2edf-416f-a26d-3967c0d4e15e-kube-api-access-kr6sb\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:11 crc kubenswrapper[4874]: I0126 00:20:11.577432 4874 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e605a81e-2edf-416f-a26d-3967c0d4e15e-util\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.183990 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq"] Jan 26 00:20:12 crc kubenswrapper[4874]: E0126 00:20:12.184224 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="pull" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.184239 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="pull" Jan 26 00:20:12 crc kubenswrapper[4874]: E0126 00:20:12.184258 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="util" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.184265 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="util" Jan 26 00:20:12 crc kubenswrapper[4874]: E0126 00:20:12.184277 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="extract" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.184285 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="extract" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.184413 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="e605a81e-2edf-416f-a26d-3967c0d4e15e" containerName="extract" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.185282 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.188893 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.210161 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq"] Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.288279 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.288467 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f948d\" (UniqueName: \"kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.288535 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.390263 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f948d\" (UniqueName: \"kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.390401 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.390512 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.391147 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.391435 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.412202 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f948d\" (UniqueName: \"kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.505887 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:12 crc kubenswrapper[4874]: I0126 00:20:12.790992 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq"] Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.166860 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs"] Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.168247 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.188003 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs"] Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.304671 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.304729 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.304774 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rkh2\" (UniqueName: \"kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.406748 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.406842 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.406934 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rkh2\" (UniqueName: \"kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.407753 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " 
pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.407843 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.430788 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rkh2\" (UniqueName: \"kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.487773 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.566057 4874 generic.go:334] "Generic (PLEG): container finished" podID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerID="0a2e1c7e5ab967d5dfd22bf808510676ac2f519576f8a4dd5bfd245ffb8d550f" exitCode=0 Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.566353 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" event={"ID":"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f","Type":"ContainerDied","Data":"0a2e1c7e5ab967d5dfd22bf808510676ac2f519576f8a4dd5bfd245ffb8d550f"} Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.566381 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" event={"ID":"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f","Type":"ContainerStarted","Data":"da3ac36752b149cfd86a9d4c3f122fd5b20a4fe3b5014d72e8a56acc058b09b0"} Jan 26 00:20:13 crc kubenswrapper[4874]: I0126 00:20:13.771100 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs"] Jan 26 00:20:14 crc kubenswrapper[4874]: I0126 00:20:14.572991 4874 generic.go:334] "Generic (PLEG): container finished" podID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerID="bac091286474857709977ff7f992e8aaf99e553b5b571f5c850464ddb9fd5438" exitCode=0 Jan 26 00:20:14 crc kubenswrapper[4874]: I0126 00:20:14.574211 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerDied","Data":"bac091286474857709977ff7f992e8aaf99e553b5b571f5c850464ddb9fd5438"} Jan 26 00:20:14 crc kubenswrapper[4874]: I0126 00:20:14.574253 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerStarted","Data":"800320e3300400b5052e3d48e61b940f3f9d892db8c0f4e21d39e11fe67d05dd"} Jan 26 00:20:15 crc kubenswrapper[4874]: I0126 00:20:15.582648 4874 generic.go:334] "Generic (PLEG): container finished" 
podID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerID="c15aaf2dd47b0d7c993dcb0c0207f9095bd68a3137bb23bf10308ce7a9cd0642" exitCode=0 Jan 26 00:20:15 crc kubenswrapper[4874]: I0126 00:20:15.582734 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" event={"ID":"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f","Type":"ContainerDied","Data":"c15aaf2dd47b0d7c993dcb0c0207f9095bd68a3137bb23bf10308ce7a9cd0642"} Jan 26 00:20:16 crc kubenswrapper[4874]: I0126 00:20:16.589576 4874 generic.go:334] "Generic (PLEG): container finished" podID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerID="f898404e3658624121f26191d9392d5bc98b9140e3596ed0f44288a9b149562d" exitCode=0 Jan 26 00:20:16 crc kubenswrapper[4874]: I0126 00:20:16.589678 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" event={"ID":"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f","Type":"ContainerDied","Data":"f898404e3658624121f26191d9392d5bc98b9140e3596ed0f44288a9b149562d"} Jan 26 00:20:16 crc kubenswrapper[4874]: I0126 00:20:16.591743 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerStarted","Data":"7be8babcec0884ee7959dd001a5a2cd2021ab3ac4d00357aaeba7e6ad7809bb4"} Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.119771 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.121075 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.181670 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.272553 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5wln\" (UniqueName: \"kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.272630 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.272650 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.415860 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5wln\" (UniqueName: \"kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln\") pod 
\"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.415939 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.415972 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.416527 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.417035 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.538913 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5wln\" (UniqueName: \"kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln\") pod \"certified-operators-bxmhq\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.599355 4874 generic.go:334] "Generic (PLEG): container finished" podID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerID="7be8babcec0884ee7959dd001a5a2cd2021ab3ac4d00357aaeba7e6ad7809bb4" exitCode=0 Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.599457 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerDied","Data":"7be8babcec0884ee7959dd001a5a2cd2021ab3ac4d00357aaeba7e6ad7809bb4"} Jan 26 00:20:17 crc kubenswrapper[4874]: I0126 00:20:17.734418 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.063126 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.063677 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.224972 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.364244 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f948d\" (UniqueName: \"kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d\") pod \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.364399 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util\") pod \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.364651 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle\") pod \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\" (UID: \"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f\") " Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.367908 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle" (OuterVolumeSpecName: "bundle") pod "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" (UID: "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.437702 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d" (OuterVolumeSpecName: "kube-api-access-f948d") pod "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" (UID: "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f"). InnerVolumeSpecName "kube-api-access-f948d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.466097 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f948d\" (UniqueName: \"kubernetes.io/projected/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-kube-api-access-f948d\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.466131 4874 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.512202 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.596784 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util" (OuterVolumeSpecName: "util") pod "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" (UID: "a29bf9c8-7825-4a6e-9ed4-84c1da217e0f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.615005 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerStarted","Data":"2b46402e53c8a6e57e1b4c11c327890b82b8c4774625f3283dee69e05567397d"} Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.618046 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerStarted","Data":"c56e94e3561031539a772abe721b3ce98f78772520a18ae49d4c0b4ea85211b5"} Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.623592 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" event={"ID":"a29bf9c8-7825-4a6e-9ed4-84c1da217e0f","Type":"ContainerDied","Data":"da3ac36752b149cfd86a9d4c3f122fd5b20a4fe3b5014d72e8a56acc058b09b0"} Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.623640 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da3ac36752b149cfd86a9d4c3f122fd5b20a4fe3b5014d72e8a56acc058b09b0" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.623695 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.665531 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" podStartSLOduration=3.821170265 podStartE2EDuration="5.665516061s" podCreationTimestamp="2026-01-26 00:20:13 +0000 UTC" firstStartedPulling="2026-01-26 00:20:14.575523077 +0000 UTC m=+789.309629624" lastFinishedPulling="2026-01-26 00:20:16.419868883 +0000 UTC m=+791.153975420" observedRunningTime="2026-01-26 00:20:18.664233345 +0000 UTC m=+793.398339882" watchObservedRunningTime="2026-01-26 00:20:18.665516061 +0000 UTC m=+793.399622598" Jan 26 00:20:18 crc kubenswrapper[4874]: I0126 00:20:18.668554 4874 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a29bf9c8-7825-4a6e-9ed4-84c1da217e0f-util\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:19 crc kubenswrapper[4874]: I0126 00:20:19.240135 4874 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qb9r8" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" probeResult="failure" output=< Jan 26 00:20:19 crc kubenswrapper[4874]: timeout: failed to connect service ":50051" within 1s Jan 26 00:20:19 crc kubenswrapper[4874]: > Jan 26 00:20:19 crc kubenswrapper[4874]: I0126 00:20:19.726872 4874 generic.go:334] "Generic (PLEG): container finished" podID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerID="2b46402e53c8a6e57e1b4c11c327890b82b8c4774625f3283dee69e05567397d" exitCode=0 Jan 26 00:20:19 crc kubenswrapper[4874]: I0126 00:20:19.726926 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerDied","Data":"2b46402e53c8a6e57e1b4c11c327890b82b8c4774625f3283dee69e05567397d"} Jan 26 00:20:19 crc kubenswrapper[4874]: I0126 
00:20:19.729620 4874 generic.go:334] "Generic (PLEG): container finished" podID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerID="796f1882158d7d2be0763b489c8a1a4eb9c0abdece1fd3723646a560d8743ce5" exitCode=0 Jan 26 00:20:19 crc kubenswrapper[4874]: I0126 00:20:19.729644 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerDied","Data":"796f1882158d7d2be0763b489c8a1a4eb9c0abdece1fd3723646a560d8743ce5"} Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.189753 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt"] Jan 26 00:20:21 crc kubenswrapper[4874]: E0126 00:20:21.190492 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="pull" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.190507 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="pull" Jan 26 00:20:21 crc kubenswrapper[4874]: E0126 00:20:21.190519 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="extract" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.190526 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="extract" Jan 26 00:20:21 crc kubenswrapper[4874]: E0126 00:20:21.190541 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="util" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.190547 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="util" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.190655 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29bf9c8-7825-4a6e-9ed4-84c1da217e0f" containerName="extract" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.191573 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.223039 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt"] Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.303765 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.303831 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.303853 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pcbs\" (UniqueName: \"kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.412210 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.412263 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.412299 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pcbs\" (UniqueName: \"kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.413306 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.413635 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.452391 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pcbs\" (UniqueName: \"kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.467211 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.511128 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.513383 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rkh2\" (UniqueName: \"kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2\") pod \"1a4e3630-d9f6-442f-9afb-4dff66861ece\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.514224 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle\") pod \"1a4e3630-d9f6-442f-9afb-4dff66861ece\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.514360 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util\") pod \"1a4e3630-d9f6-442f-9afb-4dff66861ece\" (UID: \"1a4e3630-d9f6-442f-9afb-4dff66861ece\") " Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.517413 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle" (OuterVolumeSpecName: "bundle") pod "1a4e3630-d9f6-442f-9afb-4dff66861ece" (UID: "1a4e3630-d9f6-442f-9afb-4dff66861ece"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.525061 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util" (OuterVolumeSpecName: "util") pod "1a4e3630-d9f6-442f-9afb-4dff66861ece" (UID: "1a4e3630-d9f6-442f-9afb-4dff66861ece"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.556175 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2" (OuterVolumeSpecName: "kube-api-access-5rkh2") pod "1a4e3630-d9f6-442f-9afb-4dff66861ece" (UID: "1a4e3630-d9f6-442f-9afb-4dff66861ece"). InnerVolumeSpecName "kube-api-access-5rkh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.702913 4874 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.706010 4874 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a4e3630-d9f6-442f-9afb-4dff66861ece-util\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.706158 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rkh2\" (UniqueName: \"kubernetes.io/projected/1a4e3630-d9f6-442f-9afb-4dff66861ece-kube-api-access-5rkh2\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.811343 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerStarted","Data":"4b7987d98bb4280f079c2e479cd30bd772547018cded8d690f8acef69a9bea30"} Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.940343 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" event={"ID":"1a4e3630-d9f6-442f-9afb-4dff66861ece","Type":"ContainerDied","Data":"800320e3300400b5052e3d48e61b940f3f9d892db8c0f4e21d39e11fe67d05dd"} Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.940386 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800320e3300400b5052e3d48e61b940f3f9d892db8c0f4e21d39e11fe67d05dd" Jan 26 00:20:21 crc kubenswrapper[4874]: I0126 00:20:21.940464 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs" Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.077601 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt"] Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.444823 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.445212 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.946686 4874 generic.go:334] "Generic (PLEG): container finished" podID="320748b6-5b88-4e31-b9e6-56263e13c488" containerID="c9ba2f639bffeadb86c094e4d88b2453b8fc3b075b56e5b3a66723e82889fb98" exitCode=0 Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.946790 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerDied","Data":"c9ba2f639bffeadb86c094e4d88b2453b8fc3b075b56e5b3a66723e82889fb98"} Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.947313 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerStarted","Data":"09c7ba0d34c82061852cd97bdfb159217275913c2fc61703787ccf485586a207"} Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.952600 4874 generic.go:334] "Generic (PLEG): container finished" podID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerID="4b7987d98bb4280f079c2e479cd30bd772547018cded8d690f8acef69a9bea30" exitCode=0 Jan 26 00:20:22 crc kubenswrapper[4874]: I0126 00:20:22.952660 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerDied","Data":"4b7987d98bb4280f079c2e479cd30bd772547018cded8d690f8acef69a9bea30"} Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.306646 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9"] Jan 26 00:20:23 crc kubenswrapper[4874]: E0126 00:20:23.307062 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="pull" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.307088 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="pull" Jan 26 00:20:23 crc kubenswrapper[4874]: E0126 00:20:23.307110 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="extract" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.307122 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" 
containerName="extract" Jan 26 00:20:23 crc kubenswrapper[4874]: E0126 00:20:23.307139 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="util" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.307148 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="util" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.307310 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4e3630-d9f6-442f-9afb-4dff66861ece" containerName="extract" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.307888 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.309846 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.309991 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.310184 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-562d6" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.315699 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.432994 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.433053 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfcg2\" (UniqueName: \"kubernetes.io/projected/e95db833-3b9b-48a8-8e46-6bc5701b400e-kube-api-access-tfcg2\") pod \"obo-prometheus-operator-68bc856cb9-nn5n9\" (UID: \"e95db833-3b9b-48a8-8e46-6bc5701b400e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.433641 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.435437 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fskxw" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.435788 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.444819 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.446215 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.449816 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.515149 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.537941 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfcg2\" (UniqueName: \"kubernetes.io/projected/e95db833-3b9b-48a8-8e46-6bc5701b400e-kube-api-access-tfcg2\") pod \"obo-prometheus-operator-68bc856cb9-nn5n9\" (UID: \"e95db833-3b9b-48a8-8e46-6bc5701b400e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.538032 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.538057 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.538111 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.538135 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.600606 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfcg2\" (UniqueName: \"kubernetes.io/projected/e95db833-3b9b-48a8-8e46-6bc5701b400e-kube-api-access-tfcg2\") pod \"obo-prometheus-operator-68bc856cb9-nn5n9\" (UID: \"e95db833-3b9b-48a8-8e46-6bc5701b400e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.639446 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.639501 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.639555 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.639582 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.646350 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.650776 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70482ca5-c3cf-44cc-bd48-b0bac6edf082-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2\" (UID: \"70482ca5-c3cf-44cc-bd48-b0bac6edf082\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.651525 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.654926 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/524662a5-1e5c-490a-b16c-e3c281242e8c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc\" (UID: \"524662a5-1e5c-490a-b16c-e3c281242e8c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.667181 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.751793 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.770558 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.811808 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rpfld"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.813056 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.816984 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-72mzs" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.817473 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.842599 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rpfld"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.945524 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/771abf14-9afb-4281-b842-047d2cc449d4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.945604 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4n8t\" (UniqueName: \"kubernetes.io/projected/771abf14-9afb-4281-b842-047d2cc449d4-kube-api-access-l4n8t\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.967067 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-h2z2f"] Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.967820 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.973359 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-8t9kr" Jan 26 00:20:23 crc kubenswrapper[4874]: I0126 00:20:23.988358 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-h2z2f"] Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.050884 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4n8t\" (UniqueName: \"kubernetes.io/projected/771abf14-9afb-4281-b842-047d2cc449d4-kube-api-access-l4n8t\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.050982 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce3769d-8607-45e8-afd2-84c971d29cb4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.051043 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/771abf14-9afb-4281-b842-047d2cc449d4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.051071 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6dp\" (UniqueName: \"kubernetes.io/projected/2ce3769d-8607-45e8-afd2-84c971d29cb4-kube-api-access-qz6dp\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.057830 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/771abf14-9afb-4281-b842-047d2cc449d4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.078042 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4n8t\" (UniqueName: \"kubernetes.io/projected/771abf14-9afb-4281-b842-047d2cc449d4-kube-api-access-l4n8t\") pod \"observability-operator-59bdc8b94-rpfld\" (UID: \"771abf14-9afb-4281-b842-047d2cc449d4\") " pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.154858 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce3769d-8607-45e8-afd2-84c971d29cb4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.154971 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6dp\" (UniqueName: \"kubernetes.io/projected/2ce3769d-8607-45e8-afd2-84c971d29cb4-kube-api-access-qz6dp\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.157166 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce3769d-8607-45e8-afd2-84c971d29cb4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.182385 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6dp\" (UniqueName: \"kubernetes.io/projected/2ce3769d-8607-45e8-afd2-84c971d29cb4-kube-api-access-qz6dp\") pod \"perses-operator-5bf474d74f-h2z2f\" (UID: \"2ce3769d-8607-45e8-afd2-84c971d29cb4\") " pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.199711 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.228049 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2"] Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.231492 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc"] Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.295175 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9"] Jan 26 00:20:24 crc kubenswrapper[4874]: W0126 00:20:24.304312 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode95db833_3b9b_48a8_8e46_6bc5701b400e.slice/crio-4a8132924be89dfce06b6bc120ae0ddddef7fbf93bfbd744885ee6a45e1bcddc WatchSource:0}: Error finding container 4a8132924be89dfce06b6bc120ae0ddddef7fbf93bfbd744885ee6a45e1bcddc: Status 404 returned error can't find the container with id 4a8132924be89dfce06b6bc120ae0ddddef7fbf93bfbd744885ee6a45e1bcddc Jan 26 00:20:24 crc kubenswrapper[4874]: I0126 00:20:24.309741 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:24.510264 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rpfld"] Jan 26 00:20:25 crc kubenswrapper[4874]: W0126 00:20:24.554309 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771abf14_9afb_4281_b842_047d2cc449d4.slice/crio-19ec0dfb92bb70c59ad6fcdf873ff3fc9bc5a17f4f6c4f1e44ddaf373d07e1d9 WatchSource:0}: Error finding container 19ec0dfb92bb70c59ad6fcdf873ff3fc9bc5a17f4f6c4f1e44ddaf373d07e1d9: Status 404 returned error can't find the container with id 19ec0dfb92bb70c59ad6fcdf873ff3fc9bc5a17f4f6c4f1e44ddaf373d07e1d9 Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:24.598063 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-h2z2f"] Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.043408 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" event={"ID":"524662a5-1e5c-490a-b16c-e3c281242e8c","Type":"ContainerStarted","Data":"ae41915e3bd990390cd50f91b3312008835f2d8dd5625e0293f1b3f9036177df"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.048586 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" event={"ID":"2ce3769d-8607-45e8-afd2-84c971d29cb4","Type":"ContainerStarted","Data":"2941bca1b8c348599fdf6b86008d2429c1eb8834049758dabbdc4d6340b8015f"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.060156 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerStarted","Data":"7476710e64938b5c532fb0132d4c8a43d924fa63e6433b98d0b1729b29056d9a"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.062557 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" event={"ID":"e95db833-3b9b-48a8-8e46-6bc5701b400e","Type":"ContainerStarted","Data":"4a8132924be89dfce06b6bc120ae0ddddef7fbf93bfbd744885ee6a45e1bcddc"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.066297 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" event={"ID":"771abf14-9afb-4281-b842-047d2cc449d4","Type":"ContainerStarted","Data":"19ec0dfb92bb70c59ad6fcdf873ff3fc9bc5a17f4f6c4f1e44ddaf373d07e1d9"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.070089 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" event={"ID":"70482ca5-c3cf-44cc-bd48-b0bac6edf082","Type":"ContainerStarted","Data":"54de03cd40736139d2d05d45f6e8bdff6f4a587d2e5929ba26457558c2591241"} Jan 26 00:20:25 crc kubenswrapper[4874]: I0126 00:20:25.086497 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bxmhq" podStartSLOduration=4.160649014 podStartE2EDuration="8.086475952s" podCreationTimestamp="2026-01-26 00:20:17 +0000 UTC" firstStartedPulling="2026-01-26 00:20:19.731268236 +0000 UTC m=+794.465374773" lastFinishedPulling="2026-01-26 00:20:23.657095174 +0000 UTC m=+798.391201711" observedRunningTime="2026-01-26 00:20:25.084039063 +0000 UTC 
m=+799.818145610" watchObservedRunningTime="2026-01-26 00:20:25.086475952 +0000 UTC m=+799.820582489" Jan 26 00:20:27 crc kubenswrapper[4874]: I0126 00:20:27.735007 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:27 crc kubenswrapper[4874]: I0126 00:20:27.735377 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:27 crc kubenswrapper[4874]: I0126 00:20:27.888331 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.460169 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.541654 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-w77pb"] Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.542502 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.544211 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbm9v\" (UniqueName: \"kubernetes.io/projected/db14db54-bfd4-410a-b715-f2b5de6e81c9-kube-api-access-bbm9v\") pod \"interconnect-operator-5bb49f789d-w77pb\" (UID: \"db14db54-bfd4-410a-b715-f2b5de6e81c9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.545895 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.546080 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-g9k2c" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.546119 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.562770 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-w77pb"] Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.638296 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.645162 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbm9v\" (UniqueName: \"kubernetes.io/projected/db14db54-bfd4-410a-b715-f2b5de6e81c9-kube-api-access-bbm9v\") pod \"interconnect-operator-5bb49f789d-w77pb\" (UID: \"db14db54-bfd4-410a-b715-f2b5de6e81c9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.695216 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbm9v\" (UniqueName: \"kubernetes.io/projected/db14db54-bfd4-410a-b715-f2b5de6e81c9-kube-api-access-bbm9v\") pod \"interconnect-operator-5bb49f789d-w77pb\" (UID: \"db14db54-bfd4-410a-b715-f2b5de6e81c9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" Jan 26 00:20:28 crc kubenswrapper[4874]: I0126 00:20:28.864228 4874 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.563058 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.564513 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qb9r8" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" containerID="cri-o://46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" gracePeriod=2 Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.731277 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-588fbc5f4d-bcx9n"] Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.737824 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-588fbc5f4d-bcx9n"] Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.738063 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.745207 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z4s7\" (UniqueName: \"kubernetes.io/projected/9ff52345-e08e-4d05-8754-c2915e43d9c0-kube-api-access-4z4s7\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.746569 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-apiservice-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.747009 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-webhook-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.749439 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.750860 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-4cvlr" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.847711 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-webhook-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.847768 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z4s7\" (UniqueName: \"kubernetes.io/projected/9ff52345-e08e-4d05-8754-c2915e43d9c0-kube-api-access-4z4s7\") pod 
\"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.847807 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-apiservice-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.854758 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-apiservice-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.855180 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff52345-e08e-4d05-8754-c2915e43d9c0-webhook-cert\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:32 crc kubenswrapper[4874]: I0126 00:20:32.883209 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z4s7\" (UniqueName: \"kubernetes.io/projected/9ff52345-e08e-4d05-8754-c2915e43d9c0-kube-api-access-4z4s7\") pod \"elastic-operator-588fbc5f4d-bcx9n\" (UID: \"9ff52345-e08e-4d05-8754-c2915e43d9c0\") " pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:33 crc kubenswrapper[4874]: I0126 00:20:33.072641 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" Jan 26 00:20:33 crc kubenswrapper[4874]: I0126 00:20:33.137253 4874 generic.go:334] "Generic (PLEG): container finished" podID="39304a23-bff0-4736-887e-766831f728a3" containerID="46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" exitCode=0 Jan 26 00:20:33 crc kubenswrapper[4874]: I0126 00:20:33.137298 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerDied","Data":"46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c"} Jan 26 00:20:37 crc kubenswrapper[4874]: I0126 00:20:37.820383 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:38 crc kubenswrapper[4874]: E0126 00:20:38.063468 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c is running failed: container process not found" containerID="46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:20:38 crc kubenswrapper[4874]: E0126 00:20:38.075606 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c is running failed: container process not found" containerID="46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:20:38 crc kubenswrapper[4874]: E0126 00:20:38.078458 4874 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c is running failed: container process not found" containerID="46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" cmd=["grpc_health_probe","-addr=:50051"] Jan 26 00:20:38 crc kubenswrapper[4874]: E0126 00:20:38.078554 4874 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qb9r8" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" Jan 26 00:20:41 crc kubenswrapper[4874]: I0126 00:20:41.319019 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:41 crc kubenswrapper[4874]: I0126 00:20:41.319708 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bxmhq" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="registry-server" containerID="cri-o://7476710e64938b5c532fb0132d4c8a43d924fa63e6433b98d0b1729b29056d9a" gracePeriod=2 Jan 26 00:20:42 crc kubenswrapper[4874]: I0126 00:20:42.210465 4874 generic.go:334] "Generic (PLEG): container finished" podID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerID="7476710e64938b5c532fb0132d4c8a43d924fa63e6433b98d0b1729b29056d9a" exitCode=0 Jan 26 00:20:42 crc kubenswrapper[4874]: I0126 00:20:42.210934 4874 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerDied","Data":"7476710e64938b5c532fb0132d4c8a43d924fa63e6433b98d0b1729b29056d9a"} Jan 26 00:20:43 crc kubenswrapper[4874]: E0126 00:20:43.670184 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c" Jan 26 00:20:43 crc kubenswrapper[4874]: E0126 00:20:43.670487 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:dc62889b883f597de91b5389cc52c84c607247d49a807693be2f688e4703dfc3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:e797cdb47beef40b04da7b6d645bca3dc32e6247003c45b56b38efd9e13bf01c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:093d2731ac848ed5fd57356b155a19d3bf7b8db96d95b09c5d0095e143f7254f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:7d662a120305e2528acc7e9142b770b5b6a7f4932ddfcadfa4ac953935124895,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:75465aabb0aa427a5c531a8fcde463f6d119afbcc618ebcbf6b7ee9bc8aad160,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:dc18c8d6a4a9a0a574a57cc5082c8a9b26023bd6d69b9732892d584c1dfe5070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:369729978cecdc13c99ef3d179f8eb8a450a4a0cb70b63c27a55a15d1710ba27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:d8c7a61d147f62b204d5c5f16864386025393453c9a81ea327bbd25d7765d611,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:b4a6eb1cc118a4334b424614959d8b7f361ddd779b3a72690ca49b0a3f26d9b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:21d4fff670893ba4b7fbc528cd49f8b71c8281cede9ef84f0697065bb6a7fc50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:12d9dbe297a1c3b9df671f21156992082bc483887d851fafe76e5d17321ff474,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:e65c37f04f6d76a0cbfe05edb3cddf6a8f14f859ee35cf3aebea8fcb991d2c19,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:48e4e178c6eeaa9d5dd77a591
c185a311b4b4a5caadb7199d48463123e31dc9e,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4n8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-59bdc8b94-rpfld_openshift-operators(771abf14-9afb-4281-b842-047d2cc449d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:20:43 crc kubenswrapper[4874]: E0126 00:20:43.671706 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" podUID="771abf14-9afb-4281-b842-047d2cc449d4" Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.731055 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.786231 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content\") pod \"39304a23-bff0-4736-887e-766831f728a3\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.786807 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities\") pod \"39304a23-bff0-4736-887e-766831f728a3\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.786859 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cns5b\" (UniqueName: \"kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b\") pod \"39304a23-bff0-4736-887e-766831f728a3\" (UID: \"39304a23-bff0-4736-887e-766831f728a3\") " Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.797824 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities" (OuterVolumeSpecName: "utilities") pod "39304a23-bff0-4736-887e-766831f728a3" (UID: "39304a23-bff0-4736-887e-766831f728a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.798881 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b" (OuterVolumeSpecName: "kube-api-access-cns5b") pod "39304a23-bff0-4736-887e-766831f728a3" (UID: "39304a23-bff0-4736-887e-766831f728a3"). InnerVolumeSpecName "kube-api-access-cns5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.889799 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cns5b\" (UniqueName: \"kubernetes.io/projected/39304a23-bff0-4736-887e-766831f728a3-kube-api-access-cns5b\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:43 crc kubenswrapper[4874]: I0126 00:20:43.889834 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.071859 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39304a23-bff0-4736-887e-766831f728a3" (UID: "39304a23-bff0-4736-887e-766831f728a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.093247 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39304a23-bff0-4736-887e-766831f728a3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.170600 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.246655 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerStarted","Data":"14693bea9224aa7b0686a339fe572ec757aca739424e2810bd3822fdb0b2c024"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.248296 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" event={"ID":"70482ca5-c3cf-44cc-bd48-b0bac6edf082","Type":"ContainerStarted","Data":"64ebb8d3c1589c1f7c865d406d52f477f80bd92a549a9de83cd5c91053f04eef"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.250562 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" event={"ID":"524662a5-1e5c-490a-b16c-e3c281242e8c","Type":"ContainerStarted","Data":"b7bf75c644cdf4fc0b9347edd6712f13b891c8d6d3783748a8e2752a5f8c38f1"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.256487 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-588fbc5f4d-bcx9n"] Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.264149 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qb9r8" event={"ID":"39304a23-bff0-4736-887e-766831f728a3","Type":"ContainerDied","Data":"265d5bc206ef053698657e1b6276f28e894dc4d74e235154c942e7f6f082c419"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.264207 4874 scope.go:117] "RemoveContainer" containerID="46fa1b71994286163a1169d293bdabcd086f5ee5fcfb40ea5ea021b599014c5c" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.264211 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qb9r8" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.279720 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" event={"ID":"2ce3769d-8607-45e8-afd2-84c971d29cb4","Type":"ContainerStarted","Data":"eeb5aeedec3abc35f2f233491e9d96991fa8ef599817a3053ac2aeac8c2aafd5"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.281142 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.292557 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2" podStartSLOduration=1.791803754 podStartE2EDuration="21.292539533s" podCreationTimestamp="2026-01-26 00:20:23 +0000 UTC" firstStartedPulling="2026-01-26 00:20:24.270195951 +0000 UTC m=+799.004302488" lastFinishedPulling="2026-01-26 00:20:43.77093173 +0000 UTC m=+818.505038267" observedRunningTime="2026-01-26 00:20:44.29171558 +0000 UTC m=+819.025822137" watchObservedRunningTime="2026-01-26 00:20:44.292539533 +0000 UTC m=+819.026646070" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.296889 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content\") pod \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.298033 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities\") pod \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.298123 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5wln\" (UniqueName: \"kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln\") pod \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\" (UID: \"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc\") " Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.298996 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities" (OuterVolumeSpecName: "utilities") pod "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" (UID: "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.304251 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxmhq" event={"ID":"ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc","Type":"ContainerDied","Data":"c56e94e3561031539a772abe721b3ce98f78772520a18ae49d4c0b4ea85211b5"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.304647 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxmhq" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.339058 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" event={"ID":"e95db833-3b9b-48a8-8e46-6bc5701b400e","Type":"ContainerStarted","Data":"476b64a534c381f80da7f37fe6be04ffd6183e59ee5a7f192319d6f6b573fc3c"} Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.349600 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln" (OuterVolumeSpecName: "kube-api-access-x5wln") pod "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" (UID: "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc"). InnerVolumeSpecName "kube-api-access-x5wln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:44 crc kubenswrapper[4874]: E0126 00:20:44.350117 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c\\\"\"" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" podUID="771abf14-9afb-4281-b842-047d2cc449d4" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.350470 4874 scope.go:117] "RemoveContainer" containerID="21fee66d62e288d372731064695fbbb52964e8e240c526e4ddbe5f638ba637af" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.382138 4874 scope.go:117] "RemoveContainer" containerID="44cec63b3d97762dd3f1e3ac80dc38db6b0e3f82ee15d889ae324a64c18d1fbd" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.396212 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc" podStartSLOduration=1.888151717 podStartE2EDuration="21.396196993s" podCreationTimestamp="2026-01-26 00:20:23 +0000 UTC" firstStartedPulling="2026-01-26 00:20:24.270660145 +0000 UTC m=+799.004766682" lastFinishedPulling="2026-01-26 00:20:43.778705421 +0000 UTC m=+818.512811958" observedRunningTime="2026-01-26 00:20:44.325763666 +0000 UTC m=+819.059870203" watchObservedRunningTime="2026-01-26 00:20:44.396196993 +0000 UTC m=+819.130303530" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.399247 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.399329 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5wln\" (UniqueName: \"kubernetes.io/projected/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-kube-api-access-x5wln\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.400815 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" podStartSLOduration=2.251009508 podStartE2EDuration="21.400806714s" podCreationTimestamp="2026-01-26 00:20:23 +0000 UTC" firstStartedPulling="2026-01-26 00:20:24.62028517 +0000 UTC m=+799.354391707" lastFinishedPulling="2026-01-26 00:20:43.770082376 +0000 UTC m=+818.504188913" observedRunningTime="2026-01-26 00:20:44.393427465 +0000 UTC m=+819.127534022" watchObservedRunningTime="2026-01-26 00:20:44.400806714 +0000 
UTC m=+819.134913251" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.406502 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" (UID: "ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.409419 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-w77pb"] Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.431791 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.436993 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qb9r8"] Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.450287 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-nn5n9" podStartSLOduration=1.987261108 podStartE2EDuration="21.450269947s" podCreationTimestamp="2026-01-26 00:20:23 +0000 UTC" firstStartedPulling="2026-01-26 00:20:24.307720656 +0000 UTC m=+799.041827193" lastFinishedPulling="2026-01-26 00:20:43.770729495 +0000 UTC m=+818.504836032" observedRunningTime="2026-01-26 00:20:44.448475126 +0000 UTC m=+819.182581663" watchObservedRunningTime="2026-01-26 00:20:44.450269947 +0000 UTC m=+819.184376474" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.456105 4874 scope.go:117] "RemoveContainer" containerID="7476710e64938b5c532fb0132d4c8a43d924fa63e6433b98d0b1729b29056d9a" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.478891 4874 scope.go:117] "RemoveContainer" containerID="4b7987d98bb4280f079c2e479cd30bd772547018cded8d690f8acef69a9bea30" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.496516 4874 scope.go:117] "RemoveContainer" containerID="796f1882158d7d2be0763b489c8a1a4eb9c0abdece1fd3723646a560d8743ce5" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.500570 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.639454 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:44 crc kubenswrapper[4874]: I0126 00:20:44.643297 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bxmhq"] Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.346943 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" event={"ID":"db14db54-bfd4-410a-b715-f2b5de6e81c9","Type":"ContainerStarted","Data":"034078cff0762bc7b64142888c472417be47caf906c06dbcbf2de57f1d6067c7"} Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.354007 4874 generic.go:334] "Generic (PLEG): container finished" podID="320748b6-5b88-4e31-b9e6-56263e13c488" containerID="14693bea9224aa7b0686a339fe572ec757aca739424e2810bd3822fdb0b2c024" exitCode=0 Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.354074 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerDied","Data":"14693bea9224aa7b0686a339fe572ec757aca739424e2810bd3822fdb0b2c024"} Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.361072 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" event={"ID":"9ff52345-e08e-4d05-8754-c2915e43d9c0","Type":"ContainerStarted","Data":"fda8cf704458e006c0dc23d650f97cb03e7b502c3b6c9dea9c9d43a2586b4ca0"} Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.627605 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39304a23-bff0-4736-887e-766831f728a3" path="/var/lib/kubelet/pods/39304a23-bff0-4736-887e-766831f728a3/volumes" Jan 26 00:20:45 crc kubenswrapper[4874]: I0126 00:20:45.628616 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" path="/var/lib/kubelet/pods/ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc/volumes" Jan 26 00:20:46 crc kubenswrapper[4874]: I0126 00:20:46.391882 4874 generic.go:334] "Generic (PLEG): container finished" podID="320748b6-5b88-4e31-b9e6-56263e13c488" containerID="7ab75c7288fca54cc5151faa3c47d2ad0dcaf883ba169b69855577aabdee9b63" exitCode=0 Jan 26 00:20:46 crc kubenswrapper[4874]: I0126 00:20:46.392085 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerDied","Data":"7ab75c7288fca54cc5151faa3c47d2ad0dcaf883ba169b69855577aabdee9b63"} Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.688998 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.801332 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pcbs\" (UniqueName: \"kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs\") pod \"320748b6-5b88-4e31-b9e6-56263e13c488\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.801558 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle\") pod \"320748b6-5b88-4e31-b9e6-56263e13c488\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.801687 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util\") pod \"320748b6-5b88-4e31-b9e6-56263e13c488\" (UID: \"320748b6-5b88-4e31-b9e6-56263e13c488\") " Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.804239 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle" (OuterVolumeSpecName: "bundle") pod "320748b6-5b88-4e31-b9e6-56263e13c488" (UID: "320748b6-5b88-4e31-b9e6-56263e13c488"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.809652 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs" (OuterVolumeSpecName: "kube-api-access-2pcbs") pod "320748b6-5b88-4e31-b9e6-56263e13c488" (UID: "320748b6-5b88-4e31-b9e6-56263e13c488"). InnerVolumeSpecName "kube-api-access-2pcbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.820562 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util" (OuterVolumeSpecName: "util") pod "320748b6-5b88-4e31-b9e6-56263e13c488" (UID: "320748b6-5b88-4e31-b9e6-56263e13c488"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.903273 4874 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-util\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.903738 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pcbs\" (UniqueName: \"kubernetes.io/projected/320748b6-5b88-4e31-b9e6-56263e13c488-kube-api-access-2pcbs\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:48 crc kubenswrapper[4874]: I0126 00:20:48.903753 4874 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320748b6-5b88-4e31-b9e6-56263e13c488-bundle\") on node \"crc\" DevicePath \"\"" Jan 26 00:20:49 crc kubenswrapper[4874]: I0126 00:20:49.430829 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" event={"ID":"9ff52345-e08e-4d05-8754-c2915e43d9c0","Type":"ContainerStarted","Data":"be1348fb716a88bcf367819eb4d0d319cc60e174700ddc3d3bf8acb0a2d52b59"} Jan 26 00:20:49 crc kubenswrapper[4874]: I0126 00:20:49.435473 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" event={"ID":"320748b6-5b88-4e31-b9e6-56263e13c488","Type":"ContainerDied","Data":"09c7ba0d34c82061852cd97bdfb159217275913c2fc61703787ccf485586a207"} Jan 26 00:20:49 crc kubenswrapper[4874]: I0126 00:20:49.435514 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c7ba0d34c82061852cd97bdfb159217275913c2fc61703787ccf485586a207" Jan 26 00:20:49 crc kubenswrapper[4874]: I0126 00:20:49.435575 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt" Jan 26 00:20:49 crc kubenswrapper[4874]: I0126 00:20:49.455669 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-588fbc5f4d-bcx9n" podStartSLOduration=13.058490657 podStartE2EDuration="17.455648241s" podCreationTimestamp="2026-01-26 00:20:32 +0000 UTC" firstStartedPulling="2026-01-26 00:20:44.292341708 +0000 UTC m=+819.026448245" lastFinishedPulling="2026-01-26 00:20:48.689499292 +0000 UTC m=+823.423605829" observedRunningTime="2026-01-26 00:20:49.453276494 +0000 UTC m=+824.187383041" watchObservedRunningTime="2026-01-26 00:20:49.455648241 +0000 UTC m=+824.189754778" Jan 26 00:20:52 crc kubenswrapper[4874]: I0126 00:20:52.443906 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:20:52 crc kubenswrapper[4874]: I0126 00:20:52.445741 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:20:52 crc kubenswrapper[4874]: I0126 00:20:52.446111 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:20:52 crc kubenswrapper[4874]: I0126 00:20:52.447162 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:20:52 crc kubenswrapper[4874]: I0126 00:20:52.447225 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb" gracePeriod=600 Jan 26 00:20:53 crc kubenswrapper[4874]: I0126 00:20:53.474910 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb" exitCode=0 Jan 26 00:20:53 crc kubenswrapper[4874]: I0126 00:20:53.474990 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb"} Jan 26 00:20:53 crc kubenswrapper[4874]: I0126 00:20:53.475381 4874 scope.go:117] "RemoveContainer" containerID="ae666aaf2203751960dd931caf5d4928ee70c64a2a4871227850c0594dcd70ab" Jan 26 00:20:54 crc kubenswrapper[4874]: I0126 00:20:54.313456 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-h2z2f" Jan 26 00:20:54 crc kubenswrapper[4874]: I0126 
00:20:54.482556 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" event={"ID":"db14db54-bfd4-410a-b715-f2b5de6e81c9","Type":"ContainerStarted","Data":"c41b2799abb186aaa22198406d71d15403a35d31ee51695dff35451f98450d4b"} Jan 26 00:20:54 crc kubenswrapper[4874]: I0126 00:20:54.484694 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa"} Jan 26 00:20:54 crc kubenswrapper[4874]: I0126 00:20:54.528185 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-w77pb" podStartSLOduration=17.520027217 podStartE2EDuration="26.52815526s" podCreationTimestamp="2026-01-26 00:20:28 +0000 UTC" firstStartedPulling="2026-01-26 00:20:44.455872186 +0000 UTC m=+819.189978723" lastFinishedPulling="2026-01-26 00:20:53.464000229 +0000 UTC m=+828.198106766" observedRunningTime="2026-01-26 00:20:54.501849084 +0000 UTC m=+829.235955621" watchObservedRunningTime="2026-01-26 00:20:54.52815526 +0000 UTC m=+829.262261797" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153446 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153680 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="extract-utilities" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153698 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="extract-utilities" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153710 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153717 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153727 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="util" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153735 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="util" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153741 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153746 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153757 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="extract" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153763 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="extract" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153772 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" 
containerName="extract-content" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153778 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="extract-content" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153787 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="extract-content" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153794 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="extract-content" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153804 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="pull" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153810 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="pull" Jan 26 00:20:55 crc kubenswrapper[4874]: E0126 00:20:55.153823 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="extract-utilities" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153828 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="extract-utilities" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153912 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="320748b6-5b88-4e31-b9e6-56263e13c488" containerName="extract" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153929 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="39304a23-bff0-4736-887e-766831f728a3" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.153938 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfb9c39-f468-4ddf-bb92-f6b4be4c80bc" containerName="registry-server" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.154686 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.158363 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-lbqq5" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.158540 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.158767 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.159647 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.159640 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.159860 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.160424 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.160875 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.162626 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.184029 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.311978 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312055 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312099 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312127 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: 
\"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312157 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312214 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312246 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312278 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312307 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312350 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/74e109ac-bfdd-4515-b3e3-3140c7372844-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312373 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312406 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312437 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312472 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.312498 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.413760 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414372 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414405 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414440 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414493 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: 
\"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414523 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414675 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414715 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414758 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/74e109ac-bfdd-4515-b3e3-3140c7372844-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414785 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414815 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414850 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414880 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: 
\"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414905 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.414935 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.415988 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.416042 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.416327 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.416584 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.421620 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.451880 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.452120 4874 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.452217 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.452779 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/74e109ac-bfdd-4515-b3e3-3140c7372844-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.453316 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/74e109ac-bfdd-4515-b3e3-3140c7372844-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.454184 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.455677 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.478942 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.479694 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.506230 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/74e109ac-bfdd-4515-b3e3-3140c7372844-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: 
\"74e109ac-bfdd-4515-b3e3-3140c7372844\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:55 crc kubenswrapper[4874]: I0126 00:20:55.777402 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.060708 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 26 00:20:56 crc kubenswrapper[4874]: W0126 00:20:56.072458 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e109ac_bfdd_4515_b3e3_3140c7372844.slice/crio-8bdf7d9190df0c811a398cb91387d548795a7f8c387c580ad0119b60f7f9edbd WatchSource:0}: Error finding container 8bdf7d9190df0c811a398cb91387d548795a7f8c387c580ad0119b60f7f9edbd: Status 404 returned error can't find the container with id 8bdf7d9190df0c811a398cb91387d548795a7f8c387c580ad0119b60f7f9edbd Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.511754 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"74e109ac-bfdd-4515-b3e3-3140c7372844","Type":"ContainerStarted","Data":"8bdf7d9190df0c811a398cb91387d548795a7f8c387c580ad0119b60f7f9edbd"} Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.515165 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" event={"ID":"771abf14-9afb-4281-b842-047d2cc449d4","Type":"ContainerStarted","Data":"eb0749aaa5b18d14c4737976ccb505fbce1f93cd4ab2c42ca51b3651117ecc80"} Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.516362 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.577490 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" Jan 26 00:20:56 crc kubenswrapper[4874]: I0126 00:20:56.604811 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-rpfld" podStartSLOduration=2.697667616 podStartE2EDuration="33.604792904s" podCreationTimestamp="2026-01-26 00:20:23 +0000 UTC" firstStartedPulling="2026-01-26 00:20:24.563178041 +0000 UTC m=+799.297284578" lastFinishedPulling="2026-01-26 00:20:55.470303329 +0000 UTC m=+830.204409866" observedRunningTime="2026-01-26 00:20:56.547473668 +0000 UTC m=+831.281580205" watchObservedRunningTime="2026-01-26 00:20:56.604792904 +0000 UTC m=+831.338899441" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.834894 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb"] Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.838264 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.850692 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.850919 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.851028 4874 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-w22br" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.871939 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb"] Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.996807 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bh8\" (UniqueName: \"kubernetes.io/projected/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-kube-api-access-96bh8\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:04 crc kubenswrapper[4874]: I0126 00:21:04.996880 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:05 crc kubenswrapper[4874]: I0126 00:21:05.098366 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bh8\" (UniqueName: \"kubernetes.io/projected/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-kube-api-access-96bh8\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:05 crc kubenswrapper[4874]: I0126 00:21:05.098422 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:05 crc kubenswrapper[4874]: I0126 00:21:05.099072 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:05 crc kubenswrapper[4874]: I0126 00:21:05.125497 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bh8\" (UniqueName: \"kubernetes.io/projected/10d3aa7e-cb4b-4e86-8c38-1844f024c3b7-kube-api-access-96bh8\") pod \"cert-manager-operator-controller-manager-5446d6888b-dxcpb\" (UID: \"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:05 crc kubenswrapper[4874]: I0126 00:21:05.215747 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" Jan 26 00:21:06 crc kubenswrapper[4874]: I0126 00:21:06.815329 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb"] Jan 26 00:21:06 crc kubenswrapper[4874]: W0126 00:21:06.833030 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d3aa7e_cb4b_4e86_8c38_1844f024c3b7.slice/crio-f2fde97743f5a7d1afc3d05142a63e27f35c13cf1a089631869b2ee2e4debdb1 WatchSource:0}: Error finding container f2fde97743f5a7d1afc3d05142a63e27f35c13cf1a089631869b2ee2e4debdb1: Status 404 returned error can't find the container with id f2fde97743f5a7d1afc3d05142a63e27f35c13cf1a089631869b2ee2e4debdb1 Jan 26 00:21:07 crc kubenswrapper[4874]: I0126 00:21:07.716751 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" event={"ID":"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7","Type":"ContainerStarted","Data":"f2fde97743f5a7d1afc3d05142a63e27f35c13cf1a089631869b2ee2e4debdb1"} Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.519977 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.522899 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.528635 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-pt6zh" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.528902 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.529131 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.529165 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.531789 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639349 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639407 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639427 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639457 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639476 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639549 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639588 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639611 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639627 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639672 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6j6\" (UniqueName: \"kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 
26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639709 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.639736 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.741166 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.741247 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.741280 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.741359 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.741392 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742039 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742132 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742182 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742196 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742311 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742352 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742433 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742459 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6j6\" (UniqueName: \"kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742479 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742637 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" 
Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742778 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.742922 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.743237 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.743829 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.743887 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.745003 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.750447 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.757227 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.812138 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6j6\" (UniqueName: 
\"kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6\") pod \"service-telemetry-operator-1-build\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:21 crc kubenswrapper[4874]: I0126 00:21:21.853991 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.564618 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.566035 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-operator,Image:registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911,Command:[/usr/bin/cert-manager-operator],Args:[start --v=$(OPERATOR_LOG_LEVEL) --trusted-ca-configmap=$(TRUSTED_CA_CONFIGMAP_NAME) --cloud-credentials-secret=$(CLOUD_CREDENTIALS_SECRET_NAME) --unsupported-addon-features=$(UNSUPPORTED_ADDON_FEATURES)],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:cert-manager-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_WEBHOOK,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_CA_INJECTOR,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_CONTROLLER,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_ACMESOLVER,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-acmesolver-rhel9@sha256:ba937fc4b9eee31422914352c11a45b90754ba4fbe490ea45249b90afdc4e0a7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_ISTIOCSR,Value:registry.redhat.io/cert-manager/cert-manager-istio-csr-rhel9@sha256:af1ac813b8ee414ef215936f05197bc498bccbd540f3e2a93cb522221ba112bc,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.18.3,ValueFrom:nil,},EnvVar{Name:ISTIOCSR_OPERAND_IMAGE_VERSION,Value:0.14.2,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:1.18.0,ValueFrom:nil,},EnvVar{Name:OPERATOR_LOG_LEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:TRUSTED_CA_CONFIGMAP_NAME,Value:,ValueFrom:nil,},EnvVar{Name:CLOUD_CREDENTIALS_SECRET_NAME,Value:,ValueFrom:nil,},EnvVar{Name:UNSUPPORTED_ADDON_FEATURES,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cert-manager-operator.v1.18.0,ValueFrom:nil,},},Resources:ResourceRequirem
ents{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{33554432 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96bh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-operator-controller-manager-5446d6888b-dxcpb_cert-manager-operator(10d3aa7e-cb4b-4e86-8c38-1844f024c3b7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.567301 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" podUID="10d3aa7e-cb4b-4e86-8c38-1844f024c3b7" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.932737 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.933243 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c 
/mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadO
nly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(74e109ac-bfdd-4515-b3e3-3140c7372844): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.935879 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.977346 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" Jan 26 00:21:24 crc kubenswrapper[4874]: E0126 00:21:24.977508 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911\\\"\"" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" podUID="10d3aa7e-cb4b-4e86-8c38-1844f024c3b7" Jan 26 00:21:25 crc kubenswrapper[4874]: I0126 00:21:25.093659 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 26 00:21:25 crc kubenswrapper[4874]: I0126 00:21:25.133605 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 26 00:21:25 crc kubenswrapper[4874]: I0126 00:21:25.187277 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:25 crc kubenswrapper[4874]: I0126 00:21:25.981838 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"9ed367b7-643b-419e-a14a-df6cd386156c","Type":"ContainerStarted","Data":"aff91b83d3604dd6756c0c8615e156f32082851e657cd75e42a157eb86f59065"} Jan 26 00:21:25 crc kubenswrapper[4874]: E0126 00:21:25.984401 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" Jan 26 00:21:26 crc kubenswrapper[4874]: E0126 00:21:26.989545 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" Jan 26 00:21:32 crc kubenswrapper[4874]: I0126 00:21:32.013293 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:32 crc kubenswrapper[4874]: I0126 00:21:32.030232 4874 generic.go:334] "Generic (PLEG): container finished" podID="9ed367b7-643b-419e-a14a-df6cd386156c" containerID="0da17356cf65073a8e69a4306be07393f1768fb3475816a3b551558b0be8ebd8" exitCode=0 Jan 26 00:21:32 crc kubenswrapper[4874]: I0126 00:21:32.030296 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"9ed367b7-643b-419e-a14a-df6cd386156c","Type":"ContainerDied","Data":"0da17356cf65073a8e69a4306be07393f1768fb3475816a3b551558b0be8ebd8"} Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.040111 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"9ed367b7-643b-419e-a14a-df6cd386156c","Type":"ContainerStarted","Data":"06393c18c41dbd484289871fd41c42d417ed1a792b5f593109f1a8d7b96690fa"} Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.040256 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="docker-build" containerID="cri-o://06393c18c41dbd484289871fd41c42d417ed1a792b5f593109f1a8d7b96690fa" gracePeriod=30 Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.071413 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=5.83265377 podStartE2EDuration="12.071379727s" podCreationTimestamp="2026-01-26 00:21:21 +0000 UTC" firstStartedPulling="2026-01-26 00:21:25.193274677 +0000 UTC m=+859.927381214" lastFinishedPulling="2026-01-26 00:21:31.432000634 +0000 UTC m=+866.166107171" observedRunningTime="2026-01-26 00:21:33.069367701 +0000 UTC m=+867.803474238" watchObservedRunningTime="2026-01-26 00:21:33.071379727 +0000 UTC m=+867.805486254" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.809792 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.811731 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.813898 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.814680 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.814796 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.852833 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970662 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970735 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970769 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970794 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970814 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.970974 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjrb\" (UniqueName: \"kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971065 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971171 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971212 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971309 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971344 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:33 crc kubenswrapper[4874]: I0126 00:21:33.971372 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072365 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072465 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072508 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072589 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072593 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072626 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072662 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072702 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072751 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072791 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072828 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072861 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.072904 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjrb\" (UniqueName: \"kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.073136 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.073200 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.073471 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.073669 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.073749 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.074014 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.074279 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.075490 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.079474 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.082828 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.097523 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjrb\" (UniqueName: \"kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb\") pod \"service-telemetry-operator-2-build\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:34 crc kubenswrapper[4874]: I0126 00:21:34.129439 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:21:38 crc kubenswrapper[4874]: I0126 00:21:38.203297 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 26 00:21:40 crc kubenswrapper[4874]: I0126 00:21:40.809077 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_9ed367b7-643b-419e-a14a-df6cd386156c/docker-build/0.log" Jan 26 00:21:40 crc kubenswrapper[4874]: I0126 00:21:40.810297 4874 generic.go:334] "Generic (PLEG): container finished" podID="9ed367b7-643b-419e-a14a-df6cd386156c" containerID="06393c18c41dbd484289871fd41c42d417ed1a792b5f593109f1a8d7b96690fa" exitCode=-1 Jan 26 00:21:40 crc kubenswrapper[4874]: I0126 00:21:40.810352 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"9ed367b7-643b-419e-a14a-df6cd386156c","Type":"ContainerDied","Data":"06393c18c41dbd484289871fd41c42d417ed1a792b5f593109f1a8d7b96690fa"} Jan 26 00:21:41 crc kubenswrapper[4874]: I0126 00:21:41.819445 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerStarted","Data":"baf263388196f67a11cf3a7c8ef401c877ee2f1d24aa440170856b44874d4802"} Jan 26 00:21:42 crc kubenswrapper[4874]: I0126 00:21:42.832876 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerStarted","Data":"334bc7e83f7a022f517a4d05c7e2cac5cf57ea338349be946ad656879c5702f4"} Jan 26 00:21:42 crc kubenswrapper[4874]: I0126 00:21:42.921290 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_9ed367b7-643b-419e-a14a-df6cd386156c/docker-build/0.log" Jan 26 00:21:42 crc kubenswrapper[4874]: I0126 00:21:42.922528 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115651 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115768 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115836 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115863 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115899 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115920 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.115982 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116053 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116099 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116126 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: 
\"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116187 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x6j6\" (UniqueName: \"kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116201 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116227 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets\") pod \"9ed367b7-643b-419e-a14a-df6cd386156c\" (UID: \"9ed367b7-643b-419e-a14a-df6cd386156c\") " Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116259 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116404 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116456 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116477 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116486 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.116841 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.117544 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.117657 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.117744 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.117954 4874 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118062 4874 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118149 4874 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118243 4874 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118308 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9ed367b7-643b-419e-a14a-df6cd386156c-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118384 4874 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.118459 4874 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/9ed367b7-643b-419e-a14a-df6cd386156c-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.123718 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-pull") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "builder-dockercfg-pt6zh-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.124156 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6" (OuterVolumeSpecName: "kube-api-access-8x6j6") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "kube-api-access-8x6j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.127012 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-push") pod "9ed367b7-643b-419e-a14a-df6cd386156c" (UID: "9ed367b7-643b-419e-a14a-df6cd386156c"). InnerVolumeSpecName "builder-dockercfg-pt6zh-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.219751 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-push\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.219805 4874 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ed367b7-643b-419e-a14a-df6cd386156c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.219823 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/9ed367b7-643b-419e-a14a-df6cd386156c-builder-dockercfg-pt6zh-pull\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.219835 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x6j6\" (UniqueName: \"kubernetes.io/projected/9ed367b7-643b-419e-a14a-df6cd386156c-kube-api-access-8x6j6\") on node \"crc\" DevicePath \"\"" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.840262 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"74e109ac-bfdd-4515-b3e3-3140c7372844","Type":"ContainerStarted","Data":"b78fd896caecc75a0ad789702251fb233f4354ec4ecc5f1896c82befb8b848b3"} Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.842882 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_9ed367b7-643b-419e-a14a-df6cd386156c/docker-build/0.log" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.843297 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"9ed367b7-643b-419e-a14a-df6cd386156c","Type":"ContainerDied","Data":"aff91b83d3604dd6756c0c8615e156f32082851e657cd75e42a157eb86f59065"} Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.843340 4874 scope.go:117] "RemoveContainer" containerID="06393c18c41dbd484289871fd41c42d417ed1a792b5f593109f1a8d7b96690fa" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.843698 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.845287 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" event={"ID":"10d3aa7e-cb4b-4e86-8c38-1844f024c3b7","Type":"ContainerStarted","Data":"3b311817ed6106e3df026c9569bef13b87f3d99c3eba8d0dbc28a2719283b5f9"} Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.863562 4874 scope.go:117] "RemoveContainer" containerID="0da17356cf65073a8e69a4306be07393f1768fb3475816a3b551558b0be8ebd8" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.919111 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-dxcpb" podStartSLOduration=3.89133957 podStartE2EDuration="39.919080136s" podCreationTimestamp="2026-01-26 00:21:04 +0000 UTC" firstStartedPulling="2026-01-26 00:21:06.841208816 +0000 UTC m=+841.575315353" lastFinishedPulling="2026-01-26 00:21:42.868949362 +0000 UTC m=+877.603055919" observedRunningTime="2026-01-26 00:21:43.917679507 +0000 UTC m=+878.651786044" watchObservedRunningTime="2026-01-26 00:21:43.919080136 +0000 UTC m=+878.653186673" Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.965867 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:43 crc kubenswrapper[4874]: I0126 00:21:43.979918 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 26 00:21:44 crc kubenswrapper[4874]: I0126 00:21:44.853265 4874 generic.go:334] "Generic (PLEG): container finished" podID="74e109ac-bfdd-4515-b3e3-3140c7372844" containerID="b78fd896caecc75a0ad789702251fb233f4354ec4ecc5f1896c82befb8b848b3" exitCode=0 Jan 26 00:21:44 crc kubenswrapper[4874]: I0126 00:21:44.853402 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"74e109ac-bfdd-4515-b3e3-3140c7372844","Type":"ContainerDied","Data":"b78fd896caecc75a0ad789702251fb233f4354ec4ecc5f1896c82befb8b848b3"} Jan 26 00:21:45 crc kubenswrapper[4874]: I0126 00:21:45.702669 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" path="/var/lib/kubelet/pods/9ed367b7-643b-419e-a14a-df6cd386156c/volumes" Jan 26 00:21:45 crc kubenswrapper[4874]: I0126 00:21:45.863432 4874 generic.go:334] "Generic (PLEG): container finished" podID="74e109ac-bfdd-4515-b3e3-3140c7372844" containerID="dad2567fe288f3bec03da60531cc37efaf77dac745789a392356f3713443fa9a" exitCode=0 Jan 26 00:21:45 crc kubenswrapper[4874]: I0126 00:21:45.863500 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"74e109ac-bfdd-4515-b3e3-3140c7372844","Type":"ContainerDied","Data":"dad2567fe288f3bec03da60531cc37efaf77dac745789a392356f3713443fa9a"} Jan 26 00:21:46 crc kubenswrapper[4874]: I0126 00:21:46.872617 4874 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"74e109ac-bfdd-4515-b3e3-3140c7372844","Type":"ContainerStarted","Data":"c65fe8716fa0ec090a63cf291a2d90f529a15b8f1e252220965971299f2f74de"} Jan 26 00:21:46 crc kubenswrapper[4874]: I0126 00:21:46.872900 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:21:46 crc kubenswrapper[4874]: I0126 00:21:46.919784 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=5.119366634 podStartE2EDuration="51.919756217s" podCreationTimestamp="2026-01-26 00:20:55 +0000 UTC" firstStartedPulling="2026-01-26 00:20:56.075031809 +0000 UTC m=+830.809138346" lastFinishedPulling="2026-01-26 00:21:42.875421402 +0000 UTC m=+877.609527929" observedRunningTime="2026-01-26 00:21:46.912846255 +0000 UTC m=+881.646952792" watchObservedRunningTime="2026-01-26 00:21:46.919756217 +0000 UTC m=+881.653862754" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.117121 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-grbvb"] Jan 26 00:21:48 crc kubenswrapper[4874]: E0126 00:21:48.118070 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="docker-build" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.118091 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="docker-build" Jan 26 00:21:48 crc kubenswrapper[4874]: E0126 00:21:48.118113 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="manage-dockerfile" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.118122 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="manage-dockerfile" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.118311 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed367b7-643b-419e-a14a-df6cd386156c" containerName="docker-build" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.120109 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.123648 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.123900 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.124057 4874 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7nqlx" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.131068 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-grbvb"] Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.228952 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrc5\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-kube-api-access-mrrc5\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.229069 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.330268 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrc5\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-kube-api-access-mrrc5\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.330321 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.359203 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrc5\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-kube-api-access-mrrc5\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.359660 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595eb0b5-4780-4c76-a7a9-a453173478f0-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-grbvb\" (UID: \"595eb0b5-4780-4c76-a7a9-a453173478f0\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.440774 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.971818 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb"] Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.973721 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.981502 4874 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7nzgs" Jan 26 00:21:48 crc kubenswrapper[4874]: I0126 00:21:48.987117 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb"] Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.078108 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.078181 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9fv\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-kube-api-access-7n9fv\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.179706 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.179781 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9fv\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-kube-api-access-7n9fv\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.213508 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.213645 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9fv\" (UniqueName: \"kubernetes.io/projected/68e61145-f6f9-4493-9d2b-73bda9d3cfad-kube-api-access-7n9fv\") pod \"cert-manager-cainjector-855d9ccff4-8ddhb\" (UID: \"68e61145-f6f9-4493-9d2b-73bda9d3cfad\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.252380 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-webhook-f4fb5df64-grbvb"] Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.361022 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" Jan 26 00:21:49 crc kubenswrapper[4874]: W0126 00:21:49.403225 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod595eb0b5_4780_4c76_a7a9_a453173478f0.slice/crio-48fcd79c9ab09f87013e9e19fe5084a5e54c8cd1cd2a92bc5fbb6b17f537aea5 WatchSource:0}: Error finding container 48fcd79c9ab09f87013e9e19fe5084a5e54c8cd1cd2a92bc5fbb6b17f537aea5: Status 404 returned error can't find the container with id 48fcd79c9ab09f87013e9e19fe5084a5e54c8cd1cd2a92bc5fbb6b17f537aea5 Jan 26 00:21:49 crc kubenswrapper[4874]: I0126 00:21:49.902865 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" event={"ID":"595eb0b5-4780-4c76-a7a9-a453173478f0","Type":"ContainerStarted","Data":"48fcd79c9ab09f87013e9e19fe5084a5e54c8cd1cd2a92bc5fbb6b17f537aea5"} Jan 26 00:21:50 crc kubenswrapper[4874]: I0126 00:21:50.073475 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb"] Jan 26 00:21:50 crc kubenswrapper[4874]: I0126 00:21:50.910601 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" event={"ID":"68e61145-f6f9-4493-9d2b-73bda9d3cfad","Type":"ContainerStarted","Data":"e43f142fe7cd5df9052e8e36bc0d5a4a44ff1622d61bda4b0553118bbdc61402"} Jan 26 00:21:53 crc kubenswrapper[4874]: I0126 00:21:52.925368 4874 generic.go:334] "Generic (PLEG): container finished" podID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerID="334bc7e83f7a022f517a4d05c7e2cac5cf57ea338349be946ad656879c5702f4" exitCode=0 Jan 26 00:21:53 crc kubenswrapper[4874]: I0126 00:21:52.925827 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerDied","Data":"334bc7e83f7a022f517a4d05c7e2cac5cf57ea338349be946ad656879c5702f4"} Jan 26 00:21:53 crc kubenswrapper[4874]: I0126 00:21:53.946892 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerStarted","Data":"8247e89c4c63373c00116a82de29070d9222a0c9050923a9f72ea0f45aa76762"} Jan 26 00:21:55 crc kubenswrapper[4874]: I0126 00:21:55.068776 4874 generic.go:334] "Generic (PLEG): container finished" podID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerID="8247e89c4c63373c00116a82de29070d9222a0c9050923a9f72ea0f45aa76762" exitCode=0 Jan 26 00:21:55 crc kubenswrapper[4874]: I0126 00:21:55.068831 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerDied","Data":"8247e89c4c63373c00116a82de29070d9222a0c9050923a9f72ea0f45aa76762"} Jan 26 00:21:56 crc kubenswrapper[4874]: I0126 00:21:56.081241 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerStarted","Data":"83286bc94ffd5b6a784e65ae7787ad846c0e61f69a17dd09c9d443990452997a"} Jan 26 00:21:56 crc kubenswrapper[4874]: I0126 00:21:56.142895 4874 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=23.142867553 podStartE2EDuration="23.142867553s" podCreationTimestamp="2026-01-26 00:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:21:56.140197509 +0000 UTC m=+890.874304056" watchObservedRunningTime="2026-01-26 00:21:56.142867553 +0000 UTC m=+890.876974090" Jan 26 00:22:01 crc kubenswrapper[4874]: I0126 00:22:01.311583 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" containerName="elasticsearch" probeResult="failure" output=< Jan 26 00:22:01 crc kubenswrapper[4874]: {"timestamp": "2026-01-26T00:22:01+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 26 00:22:01 crc kubenswrapper[4874]: > Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.001689 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" containerName="elasticsearch" probeResult="failure" output=< Jan 26 00:22:06 crc kubenswrapper[4874]: {"timestamp": "2026-01-26T00:22:05+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 26 00:22:06 crc kubenswrapper[4874]: > Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.155566 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-scjzz"] Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.156458 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.162577 4874 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-82b9d" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.172649 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-scjzz"] Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.340441 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspk8\" (UniqueName: \"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-kube-api-access-vspk8\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.340527 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-bound-sa-token\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.450158 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspk8\" (UniqueName: \"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-kube-api-access-vspk8\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.450241 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-bound-sa-token\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.478262 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspk8\" (UniqueName: \"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-kube-api-access-vspk8\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.480765 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bf4a267-b4eb-4717-9f5f-6a7b01536bb8-bound-sa-token\") pod \"cert-manager-86cb77c54b-scjzz\" (UID: \"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8\") " pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:06 crc kubenswrapper[4874]: I0126 00:22:06.553535 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-scjzz" Jan 26 00:22:10 crc kubenswrapper[4874]: I0126 00:22:10.918629 4874 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="74e109ac-bfdd-4515-b3e3-3140c7372844" containerName="elasticsearch" probeResult="failure" output=< Jan 26 00:22:10 crc kubenswrapper[4874]: {"timestamp": "2026-01-26T00:22:10+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 26 00:22:10 crc kubenswrapper[4874]: > Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.175483 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.176265 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-webhook,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,Command:[/app/cmd/webhook/webhook],Args:[--dynamic-serving-ca-secret-name=cert-manager-webhook-ca --dynamic-serving-ca-secret-namespace=$(POD_NAMESPACE) --dynamic-serving-dns-names=cert-manager-webhook,cert-manager-webhook.$(POD_NAMESPACE),cert-manager-webhook.$(POD_NAMESPACE).svc --secure-port=10250 
--v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:10250,Protocol:TCP,HostIP:,},ContainerPort{Name:healthcheck,HostPort:0,ContainerPort:6080,Protocol:TCP,HostIP:,},ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrrc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{1 0 healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-webhook-f4fb5df64-grbvb_cert-manager(595eb0b5-4780-4c76-a7a9-a453173478f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.177930 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" podUID="595eb0b5-4780-4c76-a7a9-a453173478f0" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.308667 4874 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.308936 4874 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cert-manager-cainjector,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,Command:[/app/cmd/cainjector/cainjector],Args:[--leader-election-namespace=kube-system --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7n9fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-cainjector-855d9ccff4-8ddhb_cert-manager(68e61145-f6f9-4493-9d2b-73bda9d3cfad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.311017 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" podUID="68e61145-f6f9-4493-9d2b-73bda9d3cfad" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.694290 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df\\\"\"" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" podUID="68e61145-f6f9-4493-9d2b-73bda9d3cfad" Jan 26 00:22:13 crc kubenswrapper[4874]: E0126 00:22:13.695015 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df\\\"\"" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" podUID="595eb0b5-4780-4c76-a7a9-a453173478f0" Jan 26 00:22:13 crc kubenswrapper[4874]: I0126 00:22:13.803675 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-scjzz"] Jan 26 00:22:14 crc kubenswrapper[4874]: 
I0126 00:22:14.698774 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-scjzz" event={"ID":"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8","Type":"ContainerStarted","Data":"1577d4035662b5c3e5c2b43d6642e42c093b9a4983c18a3e8bfceb54b19a712f"} Jan 26 00:22:15 crc kubenswrapper[4874]: I0126 00:22:15.709553 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-scjzz" event={"ID":"8bf4a267-b4eb-4717-9f5f-6a7b01536bb8","Type":"ContainerStarted","Data":"1b2d3c5686957d1518ed640a1c33ac578d89dfee3829b6aa94def770bf39b471"} Jan 26 00:22:15 crc kubenswrapper[4874]: I0126 00:22:15.738598 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-scjzz" podStartSLOduration=8.555297396 podStartE2EDuration="9.738567348s" podCreationTimestamp="2026-01-26 00:22:06 +0000 UTC" firstStartedPulling="2026-01-26 00:22:13.816425129 +0000 UTC m=+908.550531666" lastFinishedPulling="2026-01-26 00:22:14.999695091 +0000 UTC m=+909.733801618" observedRunningTime="2026-01-26 00:22:15.734111744 +0000 UTC m=+910.468218281" watchObservedRunningTime="2026-01-26 00:22:15.738567348 +0000 UTC m=+910.472673885" Jan 26 00:22:17 crc kubenswrapper[4874]: I0126 00:22:17.022918 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 26 00:22:26 crc kubenswrapper[4874]: I0126 00:22:26.864811 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" event={"ID":"595eb0b5-4780-4c76-a7a9-a453173478f0","Type":"ContainerStarted","Data":"0e7018c1f75ba13cf9d839bd231103616c0728d250173d83d781f71a0bc8cbc8"} Jan 26 00:22:26 crc kubenswrapper[4874]: I0126 00:22:26.867159 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:22:26 crc kubenswrapper[4874]: I0126 00:22:26.868572 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" event={"ID":"68e61145-f6f9-4493-9d2b-73bda9d3cfad","Type":"ContainerStarted","Data":"3dfa9088595c668b6ee02048e1eb5facfc3e4eca506a1448c3677ed5f758fa75"} Jan 26 00:22:26 crc kubenswrapper[4874]: I0126 00:22:26.891236 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" podStartSLOduration=-9223371997.963572 podStartE2EDuration="38.891204947s" podCreationTimestamp="2026-01-26 00:21:48 +0000 UTC" firstStartedPulling="2026-01-26 00:21:49.41190218 +0000 UTC m=+884.146008707" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:22:26.889697495 +0000 UTC m=+921.623804032" watchObservedRunningTime="2026-01-26 00:22:26.891204947 +0000 UTC m=+921.625311484" Jan 26 00:22:26 crc kubenswrapper[4874]: I0126 00:22:26.948377 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-8ddhb" podStartSLOduration=-9223371997.906425 podStartE2EDuration="38.948349525s" podCreationTimestamp="2026-01-26 00:21:48 +0000 UTC" firstStartedPulling="2026-01-26 00:21:50.086686727 +0000 UTC m=+884.820793274" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:22:26.947783429 +0000 UTC m=+921.681889976" watchObservedRunningTime="2026-01-26 00:22:26.948349525 +0000 UTC m=+921.682456062" Jan 26 00:22:33 crc kubenswrapper[4874]: I0126 00:22:33.445603 4874 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-grbvb" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.301713 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.304804 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.328538 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.413009 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.413168 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfwm\" (UniqueName: \"kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.413206 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.514939 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfwm\" (UniqueName: \"kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.515013 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.515071 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.515724 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.516304 
4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.540076 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfwm\" (UniqueName: \"kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm\") pod \"community-operators-lwh6l\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.641273 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:49 crc kubenswrapper[4874]: I0126 00:22:49.983676 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:22:50 crc kubenswrapper[4874]: I0126 00:22:50.077792 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerStarted","Data":"d0d59c71dbfa7f439559aedfc0e201978dd920365aebaeb4d1d758220cc9df8a"} Jan 26 00:22:51 crc kubenswrapper[4874]: I0126 00:22:51.085653 4874 generic.go:334] "Generic (PLEG): container finished" podID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerID="b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189" exitCode=0 Jan 26 00:22:51 crc kubenswrapper[4874]: I0126 00:22:51.085781 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerDied","Data":"b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189"} Jan 26 00:22:52 crc kubenswrapper[4874]: I0126 00:22:52.095544 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerStarted","Data":"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e"} Jan 26 00:22:53 crc kubenswrapper[4874]: I0126 00:22:53.106813 4874 generic.go:334] "Generic (PLEG): container finished" podID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerID="59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e" exitCode=0 Jan 26 00:22:53 crc kubenswrapper[4874]: I0126 00:22:53.108517 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerDied","Data":"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e"} Jan 26 00:22:54 crc kubenswrapper[4874]: I0126 00:22:54.118506 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerStarted","Data":"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d"} Jan 26 00:22:54 crc kubenswrapper[4874]: I0126 00:22:54.142067 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lwh6l" podStartSLOduration=2.6026384609999997 podStartE2EDuration="5.142041867s" podCreationTimestamp="2026-01-26 00:22:49 +0000 UTC" 
firstStartedPulling="2026-01-26 00:22:51.088706854 +0000 UTC m=+945.822813391" lastFinishedPulling="2026-01-26 00:22:53.62811025 +0000 UTC m=+948.362216797" observedRunningTime="2026-01-26 00:22:54.139514027 +0000 UTC m=+948.873620574" watchObservedRunningTime="2026-01-26 00:22:54.142041867 +0000 UTC m=+948.876148404" Jan 26 00:22:59 crc kubenswrapper[4874]: I0126 00:22:59.653524 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:59 crc kubenswrapper[4874]: I0126 00:22:59.654137 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:22:59 crc kubenswrapper[4874]: I0126 00:22:59.715739 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:23:00 crc kubenswrapper[4874]: I0126 00:23:00.198760 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:23:00 crc kubenswrapper[4874]: I0126 00:23:00.264190 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.171241 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lwh6l" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="registry-server" containerID="cri-o://5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d" gracePeriod=2 Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.586891 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.718698 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpfwm\" (UniqueName: \"kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm\") pod \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.718809 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content\") pod \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.718845 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities\") pod \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\" (UID: \"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59\") " Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.720046 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities" (OuterVolumeSpecName: "utilities") pod "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" (UID: "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.728362 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm" (OuterVolumeSpecName: "kube-api-access-kpfwm") pod "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" (UID: "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59"). InnerVolumeSpecName "kube-api-access-kpfwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.821646 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpfwm\" (UniqueName: \"kubernetes.io/projected/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-kube-api-access-kpfwm\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:02 crc kubenswrapper[4874]: I0126 00:23:02.821685 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-utilities\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.181559 4874 generic.go:334] "Generic (PLEG): container finished" podID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerID="5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d" exitCode=0 Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.181618 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerDied","Data":"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d"} Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.181655 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwh6l" event={"ID":"1fa6b1c7-0fd9-4113-888f-bd3c0b838e59","Type":"ContainerDied","Data":"d0d59c71dbfa7f439559aedfc0e201978dd920365aebaeb4d1d758220cc9df8a"} Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.181677 4874 scope.go:117] "RemoveContainer" containerID="5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.181829 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwh6l" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.204629 4874 scope.go:117] "RemoveContainer" containerID="59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.223668 4874 scope.go:117] "RemoveContainer" containerID="b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.249747 4874 scope.go:117] "RemoveContainer" containerID="5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d" Jan 26 00:23:03 crc kubenswrapper[4874]: E0126 00:23:03.250270 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d\": container with ID starting with 5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d not found: ID does not exist" containerID="5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.250321 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d"} err="failed to get container status \"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d\": rpc error: code = NotFound desc = could not find container \"5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d\": container with ID starting with 5a81680d2caf89bac12a1adf33a46070db1c965f4cd08f8543827f58e76d4c8d not found: ID does not exist" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.250353 4874 scope.go:117] "RemoveContainer" containerID="59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e" Jan 26 00:23:03 crc kubenswrapper[4874]: E0126 00:23:03.250642 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e\": container with ID starting with 59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e not found: ID does not exist" containerID="59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.250682 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e"} err="failed to get container status \"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e\": rpc error: code = NotFound desc = could not find container \"59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e\": container with ID starting with 59737fe5b1f0dcf1020656d543c73e0c7a802ae97d116eb1204815563d78659e not found: ID does not exist" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.250700 4874 scope.go:117] "RemoveContainer" containerID="b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189" Jan 26 00:23:03 crc kubenswrapper[4874]: E0126 00:23:03.250976 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189\": container with ID starting with b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189 not found: ID does not exist" containerID="b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189" 
Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.251000 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189"} err="failed to get container status \"b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189\": rpc error: code = NotFound desc = could not find container \"b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189\": container with ID starting with b074fdb4d1c8335d35e8b41cca208c0e6837457e7d86fc4f07fc25fd8d8d4189 not found: ID does not exist" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.340326 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" (UID: "1fa6b1c7-0fd9-4113-888f-bd3c0b838e59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.432416 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.513542 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.522328 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lwh6l"] Jan 26 00:23:03 crc kubenswrapper[4874]: I0126 00:23:03.630609 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" path="/var/lib/kubelet/pods/1fa6b1c7-0fd9-4113-888f-bd3c0b838e59/volumes" Jan 26 00:23:22 crc kubenswrapper[4874]: I0126 00:23:22.444468 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:23:22 crc kubenswrapper[4874]: I0126 00:23:22.445539 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:23:30 crc kubenswrapper[4874]: I0126 00:23:30.381856 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7dc9a88d-7adc-4707-a009-ecbbdf6ebbba/docker-build/0.log" Jan 26 00:23:30 crc kubenswrapper[4874]: I0126 00:23:30.383816 4874 generic.go:334] "Generic (PLEG): container finished" podID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerID="83286bc94ffd5b6a784e65ae7787ad846c0e61f69a17dd09c9d443990452997a" exitCode=1 Jan 26 00:23:30 crc kubenswrapper[4874]: I0126 00:23:30.383881 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerDied","Data":"83286bc94ffd5b6a784e65ae7787ad846c0e61f69a17dd09c9d443990452997a"} Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.682005 4874 log.go:25] 
"Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7dc9a88d-7adc-4707-a009-ecbbdf6ebbba/docker-build/0.log" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.683260 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839094 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdjrb\" (UniqueName: \"kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839149 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839200 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839243 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839276 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839350 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839383 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839433 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839456 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache\") 
pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839500 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839843 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.839923 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.840112 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.840152 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir\") pod \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\" (UID: \"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba\") " Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.840186 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.840951 4874 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.840997 4874 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.841012 4874 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.841910 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.842096 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.842895 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.846223 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-push") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "builder-dockercfg-pt6zh-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.846745 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb" (OuterVolumeSpecName: "kube-api-access-tdjrb") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "kube-api-access-tdjrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.847475 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-pull") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "builder-dockercfg-pt6zh-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.884703 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.943328 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.943726 4874 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.943812 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdjrb\" (UniqueName: \"kubernetes.io/projected/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-kube-api-access-tdjrb\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.943894 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-pull\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.943995 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-builder-dockercfg-pt6zh-push\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.944077 4874 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:31 crc kubenswrapper[4874]: I0126 00:23:31.944169 4874 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.017201 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.045608 4874 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.399414 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_7dc9a88d-7adc-4707-a009-ecbbdf6ebbba/docker-build/0.log" Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.400937 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"7dc9a88d-7adc-4707-a009-ecbbdf6ebbba","Type":"ContainerDied","Data":"baf263388196f67a11cf3a7c8ef401c877ee2f1d24aa440170856b44874d4802"} Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.400996 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf263388196f67a11cf3a7c8ef401c877ee2f1d24aa440170856b44874d4802" Jan 26 00:23:32 crc kubenswrapper[4874]: I0126 00:23:32.401136 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 26 00:23:33 crc kubenswrapper[4874]: I0126 00:23:33.776479 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" (UID: "7dc9a88d-7adc-4707-a009-ecbbdf6ebbba"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:23:33 crc kubenswrapper[4874]: I0126 00:23:33.877325 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7dc9a88d-7adc-4707-a009-ecbbdf6ebbba-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.058673 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059519 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="manage-dockerfile" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059535 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="manage-dockerfile" Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059562 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="registry-server" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059570 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="registry-server" Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059582 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="extract-content" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059589 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="extract-content" Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059599 4874 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="git-clone" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059606 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="git-clone" Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059614 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="docker-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059620 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="docker-build" Jan 26 00:23:42 crc kubenswrapper[4874]: E0126 00:23:42.059632 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="extract-utilities" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059638 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="extract-utilities" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059761 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc9a88d-7adc-4707-a009-ecbbdf6ebbba" containerName="docker-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.059783 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa6b1c7-0fd9-4113-888f-bd3c0b838e59" containerName="registry-server" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.060766 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.063349 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.063547 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.063680 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.064311 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-pt6zh" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.086662 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.195547 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.195707 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.195770 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196012 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196097 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196154 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196180 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196272 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196291 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfgt\" (UniqueName: \"kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196321 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196368 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.196394 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297318 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297398 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297448 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297473 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfgt\" (UniqueName: \"kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297505 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297531 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297554 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297607 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297631 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297655 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297696 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297722 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297816 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.297547 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.298084 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc 
kubenswrapper[4874]: I0126 00:23:42.298430 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.298577 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.298677 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.299129 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.299466 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.299674 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.305632 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.312139 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.320261 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfgt\" (UniqueName: 
\"kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt\") pod \"service-telemetry-operator-3-build\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.383568 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:23:42 crc kubenswrapper[4874]: I0126 00:23:42.660613 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 26 00:23:43 crc kubenswrapper[4874]: I0126 00:23:43.479195 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerStarted","Data":"e4dc8b4aa259c063df3d487f653cc372cd5f8d784647a349d8c302944f91b654"} Jan 26 00:23:43 crc kubenswrapper[4874]: I0126 00:23:43.479542 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerStarted","Data":"828548ce68bd0f6e6e425ff34b9ad42eb6e8f5d03ad2b26cdd8bce01bb053eb4"} Jan 26 00:23:52 crc kubenswrapper[4874]: I0126 00:23:52.444246 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:23:52 crc kubenswrapper[4874]: I0126 00:23:52.445237 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:23:52 crc kubenswrapper[4874]: I0126 00:23:52.564283 4874 generic.go:334] "Generic (PLEG): container finished" podID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerID="e4dc8b4aa259c063df3d487f653cc372cd5f8d784647a349d8c302944f91b654" exitCode=0 Jan 26 00:23:52 crc kubenswrapper[4874]: I0126 00:23:52.564338 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerDied","Data":"e4dc8b4aa259c063df3d487f653cc372cd5f8d784647a349d8c302944f91b654"} Jan 26 00:23:53 crc kubenswrapper[4874]: I0126 00:23:53.573159 4874 generic.go:334] "Generic (PLEG): container finished" podID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerID="1b0a0f64c8447e2161f8933db8a7f5abf1f34e7bae266e2903653ecf6303258b" exitCode=0 Jan 26 00:23:53 crc kubenswrapper[4874]: I0126 00:23:53.573224 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerDied","Data":"1b0a0f64c8447e2161f8933db8a7f5abf1f34e7bae266e2903653ecf6303258b"} Jan 26 00:23:53 crc kubenswrapper[4874]: I0126 00:23:53.696315 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_ea9f1e83-5a38-47ee-90e3-16a818f4d0f9/manage-dockerfile/0.log" Jan 26 00:23:54 crc kubenswrapper[4874]: I0126 00:23:54.584658 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerStarted","Data":"9e212c93b7fa50022fce10d9ecddcf5edd827949aee613f1a6f21da7b6a55ffc"} Jan 26 00:23:54 crc kubenswrapper[4874]: I0126 00:23:54.618911 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-3-build" podStartSLOduration=13.618875014 podStartE2EDuration="13.618875014s" podCreationTimestamp="2026-01-26 00:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:23:54.614405231 +0000 UTC m=+1009.348511778" watchObservedRunningTime="2026-01-26 00:23:54.618875014 +0000 UTC m=+1009.352981591" Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.445017 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.445541 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.445596 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.446131 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.446190 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa" gracePeriod=600 Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.797365 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa"} Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.797836 4874 scope.go:117] "RemoveContainer" containerID="80cb2c7272f4e2f3e98351d6918466cfb5e254dd1e4e8cac8003d20db9e8a1cb" Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.797314 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa" exitCode=0 Jan 26 00:24:22 crc kubenswrapper[4874]: I0126 00:24:22.799117 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" 
event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431"} Jan 26 00:25:04 crc kubenswrapper[4874]: I0126 00:25:04.343010 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_ea9f1e83-5a38-47ee-90e3-16a818f4d0f9/docker-build/0.log" Jan 26 00:25:04 crc kubenswrapper[4874]: I0126 00:25:04.344846 4874 generic.go:334] "Generic (PLEG): container finished" podID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerID="9e212c93b7fa50022fce10d9ecddcf5edd827949aee613f1a6f21da7b6a55ffc" exitCode=1 Jan 26 00:25:04 crc kubenswrapper[4874]: I0126 00:25:04.344897 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerDied","Data":"9e212c93b7fa50022fce10d9ecddcf5edd827949aee613f1a6f21da7b6a55ffc"} Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.642256 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_ea9f1e83-5a38-47ee-90e3-16a818f4d0f9/docker-build/0.log" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.643437 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.717309 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.717437 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvfgt\" (UniqueName: \"kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.717480 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.717508 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718395 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718488 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" 
(UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718510 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718546 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718565 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718581 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718608 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.718681 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run\") pod \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\" (UID: \"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9\") " Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.719011 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.719434 4874 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.720272 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.723662 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-pull") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "builder-dockercfg-pt6zh-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.724503 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.724661 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.725241 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.732154 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.734235 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt" (OuterVolumeSpecName: "kube-api-access-jvfgt") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "kube-api-access-jvfgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.735239 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-push") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "builder-dockercfg-pt6zh-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.761344 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821457 4874 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821528 4874 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821539 4874 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821552 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-push\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821563 4874 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821572 4874 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821581 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821589 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-builder-dockercfg-pt6zh-pull\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.821598 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvfgt\" (UniqueName: \"kubernetes.io/projected/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-kube-api-access-jvfgt\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.913264 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:25:05 crc kubenswrapper[4874]: I0126 00:25:05.922764 4874 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:06 crc kubenswrapper[4874]: I0126 00:25:06.376491 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_ea9f1e83-5a38-47ee-90e3-16a818f4d0f9/docker-build/0.log" Jan 26 00:25:06 crc kubenswrapper[4874]: I0126 00:25:06.380663 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"ea9f1e83-5a38-47ee-90e3-16a818f4d0f9","Type":"ContainerDied","Data":"828548ce68bd0f6e6e425ff34b9ad42eb6e8f5d03ad2b26cdd8bce01bb053eb4"} Jan 26 00:25:06 crc kubenswrapper[4874]: I0126 00:25:06.380719 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="828548ce68bd0f6e6e425ff34b9ad42eb6e8f5d03ad2b26cdd8bce01bb053eb4" Jan 26 00:25:06 crc kubenswrapper[4874]: I0126 00:25:06.380824 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 26 00:25:07 crc kubenswrapper[4874]: I0126 00:25:07.537827 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" (UID: "ea9f1e83-5a38-47ee-90e3-16a818f4d0f9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:25:07 crc kubenswrapper[4874]: I0126 00:25:07.545212 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ea9f1e83-5a38-47ee-90e3-16a818f4d0f9-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.853640 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 26 00:25:16 crc kubenswrapper[4874]: E0126 00:25:16.855003 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="docker-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.855025 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="docker-build" Jan 26 00:25:16 crc kubenswrapper[4874]: E0126 00:25:16.855063 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="manage-dockerfile" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.855073 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="manage-dockerfile" Jan 26 00:25:16 crc kubenswrapper[4874]: E0126 00:25:16.855089 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="git-clone" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.855098 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="git-clone" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.855258 4874 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ea9f1e83-5a38-47ee-90e3-16a818f4d0f9" containerName="docker-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.856500 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.859253 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.859336 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.859268 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.859526 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-pt6zh" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.876490 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.980206 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.980279 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhzq\" (UniqueName: \"kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.980820 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.980940 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981052 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981138 4874 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981218 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981340 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981381 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981422 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981494 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:16 crc kubenswrapper[4874]: I0126 00:25:16.981526 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.082884 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.082969 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083004 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083035 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhzq\" (UniqueName: \"kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083062 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083089 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083105 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083126 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083140 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083174 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083193 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083222 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.083769 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085027 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085031 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085091 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085273 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085435 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085460 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085615 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.085471 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.092168 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.092209 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.103687 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhzq\" (UniqueName: \"kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq\") pod \"service-telemetry-operator-4-build\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.177066 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:25:17 crc kubenswrapper[4874]: I0126 00:25:17.639670 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 26 00:25:18 crc kubenswrapper[4874]: I0126 00:25:18.484068 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerStarted","Data":"ce97e0da4961891210524bd6cf19b0ec1d649d5c1cc8448701ecddc9e02c5dee"} Jan 26 00:25:18 crc kubenswrapper[4874]: I0126 00:25:18.484119 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerStarted","Data":"8b2e4ad26bcc5863203cea9da0d2188a28de0d23f53b91792cc586333572024d"} Jan 26 00:25:27 crc kubenswrapper[4874]: I0126 00:25:27.553160 4874 generic.go:334] "Generic (PLEG): container finished" podID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerID="ce97e0da4961891210524bd6cf19b0ec1d649d5c1cc8448701ecddc9e02c5dee" exitCode=0 Jan 26 00:25:27 crc kubenswrapper[4874]: I0126 00:25:27.553312 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerDied","Data":"ce97e0da4961891210524bd6cf19b0ec1d649d5c1cc8448701ecddc9e02c5dee"} Jan 26 00:25:29 crc kubenswrapper[4874]: I0126 00:25:29.579302 4874 generic.go:334] "Generic (PLEG): container finished" podID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerID="89e7bea278606914376b4f66040ee451a1c654b48844727f2a1e0ef6bb60f452" exitCode=0 Jan 26 00:25:29 crc kubenswrapper[4874]: I0126 00:25:29.579389 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerDied","Data":"89e7bea278606914376b4f66040ee451a1c654b48844727f2a1e0ef6bb60f452"} Jan 26 00:25:29 crc kubenswrapper[4874]: I0126 00:25:29.632501 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3/manage-dockerfile/0.log" Jan 26 00:25:30 crc kubenswrapper[4874]: I0126 00:25:30.595360 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerStarted","Data":"a39677853120a33aa1ef1a503e155c2988ed4bf0f1cfa3480896ad7952ed0547"} Jan 26 00:25:30 crc kubenswrapper[4874]: I0126 00:25:30.630483 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-4-build" podStartSLOduration=14.630465947 podStartE2EDuration="14.630465947s" podCreationTimestamp="2026-01-26 00:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:25:30.628119582 +0000 UTC m=+1105.362226109" watchObservedRunningTime="2026-01-26 00:25:30.630465947 +0000 UTC m=+1105.364572484" Jan 26 00:26:22 crc kubenswrapper[4874]: I0126 00:26:22.444272 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 26 00:26:22 crc kubenswrapper[4874]: I0126 00:26:22.445633 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:26:43 crc kubenswrapper[4874]: I0126 00:26:43.119721 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3/docker-build/0.log" Jan 26 00:26:43 crc kubenswrapper[4874]: I0126 00:26:43.121518 4874 generic.go:334] "Generic (PLEG): container finished" podID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerID="a39677853120a33aa1ef1a503e155c2988ed4bf0f1cfa3480896ad7952ed0547" exitCode=1 Jan 26 00:26:43 crc kubenswrapper[4874]: I0126 00:26:43.121597 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerDied","Data":"a39677853120a33aa1ef1a503e155c2988ed4bf0f1cfa3480896ad7952ed0547"} Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.409019 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3/docker-build/0.log" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.410328 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535556 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535669 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535758 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535814 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535843 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkhzq\" (UniqueName: \"kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc 
kubenswrapper[4874]: I0126 00:26:44.535913 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.535941 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536002 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536037 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536064 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536113 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536163 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run\") pod \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\" (UID: \"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3\") " Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536580 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.536751 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537165 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537215 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537409 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537868 4874 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537899 4874 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537917 4874 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537929 4874 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.537942 4874 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.539652 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.543182 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-pull") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "builder-dockercfg-pt6zh-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.544422 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-push") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "builder-dockercfg-pt6zh-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.545004 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq" (OuterVolumeSpecName: "kube-api-access-mkhzq") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "kube-api-access-mkhzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.585942 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.639765 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-pull\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.639815 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.639825 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-builder-dockercfg-pt6zh-push\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.639837 4874 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.639852 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkhzq\" (UniqueName: \"kubernetes.io/projected/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-kube-api-access-mkhzq\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.733344 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:26:44 crc kubenswrapper[4874]: I0126 00:26:44.740521 4874 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:45 crc kubenswrapper[4874]: I0126 00:26:45.137898 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3/docker-build/0.log" Jan 26 00:26:45 crc kubenswrapper[4874]: I0126 00:26:45.139314 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3","Type":"ContainerDied","Data":"8b2e4ad26bcc5863203cea9da0d2188a28de0d23f53b91792cc586333572024d"} Jan 26 00:26:45 crc kubenswrapper[4874]: I0126 00:26:45.139372 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2e4ad26bcc5863203cea9da0d2188a28de0d23f53b91792cc586333572024d" Jan 26 00:26:45 crc kubenswrapper[4874]: I0126 00:26:45.139433 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 26 00:26:46 crc kubenswrapper[4874]: I0126 00:26:46.591522 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" (UID: "f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3"). 
InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:26:46 crc kubenswrapper[4874]: I0126 00:26:46.676577 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 26 00:26:52 crc kubenswrapper[4874]: I0126 00:26:52.444941 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:26:52 crc kubenswrapper[4874]: I0126 00:26:52.445602 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.237345 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 26 00:26:55 crc kubenswrapper[4874]: E0126 00:26:55.238725 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="docker-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.238744 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="docker-build" Jan 26 00:26:55 crc kubenswrapper[4874]: E0126 00:26:55.238774 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="manage-dockerfile" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.238782 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="manage-dockerfile" Jan 26 00:26:55 crc kubenswrapper[4874]: E0126 00:26:55.238795 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="git-clone" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.238801 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="git-clone" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.238915 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f7b9cb-c3c2-4ae4-9c83-d2e14ae669d3" containerName="docker-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.239878 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.242319 4874 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-pt6zh" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.242359 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.244227 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.244375 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.261837 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.417746 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.417824 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.417866 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.417939 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.417984 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clffd\" (UniqueName: \"kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418211 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: 
\"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418328 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418426 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418520 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418563 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418643 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.418693 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520205 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520329 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " 
pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520393 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520495 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520573 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520651 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520699 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clffd\" (UniqueName: \"kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520746 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520791 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520869 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520916 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.520956 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.521683 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.521950 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.522044 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.522039 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.522574 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.522708 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.522882 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 
crc kubenswrapper[4874]: I0126 00:26:55.523118 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.523988 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.529089 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.531640 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.543462 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clffd\" (UniqueName: \"kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd\") pod \"service-telemetry-operator-5-build\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.561369 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:26:55 crc kubenswrapper[4874]: I0126 00:26:55.839514 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 26 00:26:56 crc kubenswrapper[4874]: I0126 00:26:56.235351 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerStarted","Data":"1bb6583f8c252317cdf1bf47a17007c73707a5751a4c27c60a1f0c266741323b"} Jan 26 00:26:56 crc kubenswrapper[4874]: I0126 00:26:56.235783 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerStarted","Data":"c4b938c8330bb5c978aaac75ce4468865ec8915c5f1985188a514e11aabe1e29"} Jan 26 00:27:05 crc kubenswrapper[4874]: I0126 00:27:05.312064 4874 generic.go:334] "Generic (PLEG): container finished" podID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerID="1bb6583f8c252317cdf1bf47a17007c73707a5751a4c27c60a1f0c266741323b" exitCode=0 Jan 26 00:27:05 crc kubenswrapper[4874]: I0126 00:27:05.312202 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerDied","Data":"1bb6583f8c252317cdf1bf47a17007c73707a5751a4c27c60a1f0c266741323b"} Jan 26 00:27:06 crc kubenswrapper[4874]: I0126 00:27:06.326215 4874 generic.go:334] "Generic (PLEG): container finished" podID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerID="02c57633e68ac5df3e9427cbbbe7985987f01058a5300f3ba938ace7c7d3200f" exitCode=0 Jan 26 00:27:06 crc kubenswrapper[4874]: I0126 00:27:06.326354 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerDied","Data":"02c57633e68ac5df3e9427cbbbe7985987f01058a5300f3ba938ace7c7d3200f"} Jan 26 00:27:06 crc kubenswrapper[4874]: I0126 00:27:06.397622 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_317a15bc-5848-41a5-8fa3-b6832e63363e/manage-dockerfile/0.log" Jan 26 00:27:07 crc kubenswrapper[4874]: I0126 00:27:07.338112 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerStarted","Data":"c1f1c388e02a90f96d407fd940d8afe6f71c7117cec34f5e4c82fad59dbbf84a"} Jan 26 00:27:07 crc kubenswrapper[4874]: I0126 00:27:07.374477 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-5-build" podStartSLOduration=12.374448352 podStartE2EDuration="12.374448352s" podCreationTimestamp="2026-01-26 00:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-26 00:27:07.373179727 +0000 UTC m=+1202.107286274" watchObservedRunningTime="2026-01-26 00:27:07.374448352 +0000 UTC m=+1202.108554899" Jan 26 00:27:22 crc kubenswrapper[4874]: I0126 00:27:22.447801 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 26 00:27:22 crc kubenswrapper[4874]: I0126 00:27:22.448808 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:27:22 crc kubenswrapper[4874]: I0126 00:27:22.448893 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:27:22 crc kubenswrapper[4874]: I0126 00:27:22.449899 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:27:22 crc kubenswrapper[4874]: I0126 00:27:22.449996 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431" gracePeriod=600 Jan 26 00:27:22 crc kubenswrapper[4874]: E0126 00:27:22.582802 4874 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fd285d_2d8d_4a57_8afa_450025153e32.slice/crio-conmon-9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fd285d_2d8d_4a57_8afa_450025153e32.slice/crio-9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431.scope\": RecentStats: unable to find data in memory cache]" Jan 26 00:27:23 crc kubenswrapper[4874]: I0126 00:27:23.472537 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431" exitCode=0 Jan 26 00:27:23 crc kubenswrapper[4874]: I0126 00:27:23.472630 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431"} Jan 26 00:27:23 crc kubenswrapper[4874]: I0126 00:27:23.473124 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634"} Jan 26 00:27:23 crc kubenswrapper[4874]: I0126 00:27:23.473163 4874 scope.go:117] "RemoveContainer" containerID="3ee75d759155b8832b753c645587cef74c69208f7e96da35246abb4b342d40fa" Jan 26 00:28:11 crc kubenswrapper[4874]: I0126 00:28:11.826376 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_317a15bc-5848-41a5-8fa3-b6832e63363e/docker-build/0.log" Jan 26 00:28:11 crc kubenswrapper[4874]: I0126 00:28:11.827845 4874 
generic.go:334] "Generic (PLEG): container finished" podID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerID="c1f1c388e02a90f96d407fd940d8afe6f71c7117cec34f5e4c82fad59dbbf84a" exitCode=1 Jan 26 00:28:11 crc kubenswrapper[4874]: I0126 00:28:11.827891 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerDied","Data":"c1f1c388e02a90f96d407fd940d8afe6f71c7117cec34f5e4c82fad59dbbf84a"} Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.156661 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_317a15bc-5848-41a5-8fa3-b6832e63363e/docker-build/0.log" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.158655 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.342727 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343002 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343167 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343314 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clffd\" (UniqueName: \"kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343441 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343558 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343760 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 
00:28:13.343863 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.343975 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.344099 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.344267 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.344488 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles\") pod \"317a15bc-5848-41a5-8fa3-b6832e63363e\" (UID: \"317a15bc-5848-41a5-8fa3-b6832e63363e\") " Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.345981 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.346246 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.346275 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.346843 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.347547 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.348421 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.352497 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-push") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "builder-dockercfg-pt6zh-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.352673 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd" (OuterVolumeSpecName: "kube-api-access-clffd") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "kube-api-access-clffd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.352779 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull" (OuterVolumeSpecName: "builder-dockercfg-pt6zh-pull") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "builder-dockercfg-pt6zh-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.385192 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.446994 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-push\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-push\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447035 4874 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447047 4874 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-pt6zh-pull\" (UniqueName: \"kubernetes.io/secret/317a15bc-5848-41a5-8fa3-b6832e63363e-builder-dockercfg-pt6zh-pull\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447057 4874 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447068 4874 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447077 4874 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/317a15bc-5848-41a5-8fa3-b6832e63363e-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447092 4874 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447105 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447115 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clffd\" (UniqueName: \"kubernetes.io/projected/317a15bc-5848-41a5-8fa3-b6832e63363e-kube-api-access-clffd\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.447127 4874 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/317a15bc-5848-41a5-8fa3-b6832e63363e-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.554972 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.651136 4874 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.844515 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_317a15bc-5848-41a5-8fa3-b6832e63363e/docker-build/0.log" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.848538 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"317a15bc-5848-41a5-8fa3-b6832e63363e","Type":"ContainerDied","Data":"c4b938c8330bb5c978aaac75ce4468865ec8915c5f1985188a514e11aabe1e29"} Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.848598 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4b938c8330bb5c978aaac75ce4468865ec8915c5f1985188a514e11aabe1e29" Jan 26 00:28:13 crc kubenswrapper[4874]: I0126 00:28:13.848664 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 26 00:28:15 crc kubenswrapper[4874]: I0126 00:28:15.225206 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "317a15bc-5848-41a5-8fa3-b6832e63363e" (UID: "317a15bc-5848-41a5-8fa3-b6832e63363e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:28:15 crc kubenswrapper[4874]: I0126 00:28:15.276896 4874 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/317a15bc-5848-41a5-8fa3-b6832e63363e-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.679779 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5mvxb/must-gather-phs8l"] Jan 26 00:29:03 crc kubenswrapper[4874]: E0126 00:29:03.681131 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="manage-dockerfile" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.681167 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="manage-dockerfile" Jan 26 00:29:03 crc kubenswrapper[4874]: E0126 00:29:03.681185 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="docker-build" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.681193 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="docker-build" Jan 26 00:29:03 crc kubenswrapper[4874]: E0126 00:29:03.681213 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="git-clone" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.681222 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="git-clone" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.681384 4874 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="317a15bc-5848-41a5-8fa3-b6832e63363e" containerName="docker-build" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.682331 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.686385 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mvxb"/"kube-root-ca.crt" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.687947 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mvxb"/"openshift-service-ca.crt" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.767886 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mvxb/must-gather-phs8l"] Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.796666 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.796758 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rp8\" (UniqueName: \"kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.898762 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.899253 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rp8\" (UniqueName: \"kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.899314 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:03 crc kubenswrapper[4874]: I0126 00:29:03.928124 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rp8\" (UniqueName: \"kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8\") pod \"must-gather-phs8l\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:04 crc kubenswrapper[4874]: I0126 00:29:04.007028 4874 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:29:04 crc kubenswrapper[4874]: I0126 00:29:04.233447 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mvxb/must-gather-phs8l"] Jan 26 00:29:04 crc kubenswrapper[4874]: I0126 00:29:04.243474 4874 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 26 00:29:05 crc kubenswrapper[4874]: I0126 00:29:05.229402 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mvxb/must-gather-phs8l" event={"ID":"9f2deefa-15cf-4b89-8774-537359264adf","Type":"ContainerStarted","Data":"7f3428f5e96c80caaa445a8ee5b8ea63f0f40e71adb2deb8301d08e741c819ff"} Jan 26 00:29:11 crc kubenswrapper[4874]: I0126 00:29:11.283382 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mvxb/must-gather-phs8l" event={"ID":"9f2deefa-15cf-4b89-8774-537359264adf","Type":"ContainerStarted","Data":"5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d"} Jan 26 00:29:11 crc kubenswrapper[4874]: I0126 00:29:11.284411 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mvxb/must-gather-phs8l" event={"ID":"9f2deefa-15cf-4b89-8774-537359264adf","Type":"ContainerStarted","Data":"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9"} Jan 26 00:29:11 crc kubenswrapper[4874]: I0126 00:29:11.304301 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5mvxb/must-gather-phs8l" podStartSLOduration=1.855152834 podStartE2EDuration="8.304278327s" podCreationTimestamp="2026-01-26 00:29:03 +0000 UTC" firstStartedPulling="2026-01-26 00:29:04.243418701 +0000 UTC m=+1318.977525238" lastFinishedPulling="2026-01-26 00:29:10.692544184 +0000 UTC m=+1325.426650731" observedRunningTime="2026-01-26 00:29:11.302437284 +0000 UTC m=+1326.036543841" watchObservedRunningTime="2026-01-26 00:29:11.304278327 +0000 UTC m=+1326.038384884" Jan 26 00:29:22 crc kubenswrapper[4874]: I0126 00:29:22.444751 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:29:22 crc kubenswrapper[4874]: I0126 00:29:22.445672 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:29:52 crc kubenswrapper[4874]: I0126 00:29:52.444519 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:29:52 crc kubenswrapper[4874]: I0126 00:29:52.445246 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:29:53 crc kubenswrapper[4874]: I0126 
00:29:53.026534 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2cnr6_5b287aa8-5b0e-449a-bd5e-8561d3c74287/control-plane-machine-set-operator/0.log" Jan 26 00:29:53 crc kubenswrapper[4874]: I0126 00:29:53.170307 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7wd7t_8077625e-f489-46b7-9b01-66a13c82f649/kube-rbac-proxy/0.log" Jan 26 00:29:53 crc kubenswrapper[4874]: I0126 00:29:53.201437 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7wd7t_8077625e-f489-46b7-9b01-66a13c82f649/machine-api-operator/0.log" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.156723 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28"] Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.159145 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.162365 4874 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.162430 4874 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.166827 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28"] Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.213806 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbphk\" (UniqueName: \"kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.213896 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.213943 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.315729 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.315843 4874 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.315905 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbphk\" (UniqueName: \"kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.317141 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.323921 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.341924 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbphk\" (UniqueName: \"kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk\") pod \"collect-profiles-29489790-xfn28\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.523424 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:00 crc kubenswrapper[4874]: I0126 00:30:00.775134 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28"] Jan 26 00:30:01 crc kubenswrapper[4874]: I0126 00:30:01.636386 4874 generic.go:334] "Generic (PLEG): container finished" podID="c5b8e8be-f61c-4aad-b262-407526e875b2" containerID="08a92d0da8b3124a45d3bc998ba8507f1c8d6c2a276c81a80cffbd92f0175fe2" exitCode=0 Jan 26 00:30:01 crc kubenswrapper[4874]: I0126 00:30:01.636439 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" event={"ID":"c5b8e8be-f61c-4aad-b262-407526e875b2","Type":"ContainerDied","Data":"08a92d0da8b3124a45d3bc998ba8507f1c8d6c2a276c81a80cffbd92f0175fe2"} Jan 26 00:30:01 crc kubenswrapper[4874]: I0126 00:30:01.636469 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" event={"ID":"c5b8e8be-f61c-4aad-b262-407526e875b2","Type":"ContainerStarted","Data":"68b193aa7a422009a27a2d21da637421926d990f97ce57278d1d86439c874e5e"} Jan 26 00:30:02 crc kubenswrapper[4874]: I0126 00:30:02.951850 4874 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.059734 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbphk\" (UniqueName: \"kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk\") pod \"c5b8e8be-f61c-4aad-b262-407526e875b2\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.060019 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume\") pod \"c5b8e8be-f61c-4aad-b262-407526e875b2\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.060055 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume\") pod \"c5b8e8be-f61c-4aad-b262-407526e875b2\" (UID: \"c5b8e8be-f61c-4aad-b262-407526e875b2\") " Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.060784 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5b8e8be-f61c-4aad-b262-407526e875b2" (UID: "c5b8e8be-f61c-4aad-b262-407526e875b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.066341 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5b8e8be-f61c-4aad-b262-407526e875b2" (UID: "c5b8e8be-f61c-4aad-b262-407526e875b2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.066419 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk" (OuterVolumeSpecName: "kube-api-access-nbphk") pod "c5b8e8be-f61c-4aad-b262-407526e875b2" (UID: "c5b8e8be-f61c-4aad-b262-407526e875b2"). InnerVolumeSpecName "kube-api-access-nbphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.162518 4874 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b8e8be-f61c-4aad-b262-407526e875b2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.163131 4874 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b8e8be-f61c-4aad-b262-407526e875b2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.163152 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbphk\" (UniqueName: \"kubernetes.io/projected/c5b8e8be-f61c-4aad-b262-407526e875b2-kube-api-access-nbphk\") on node \"crc\" DevicePath \"\"" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.653234 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" event={"ID":"c5b8e8be-f61c-4aad-b262-407526e875b2","Type":"ContainerDied","Data":"68b193aa7a422009a27a2d21da637421926d990f97ce57278d1d86439c874e5e"} Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.653301 4874 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b193aa7a422009a27a2d21da637421926d990f97ce57278d1d86439c874e5e" Jan 26 00:30:03 crc kubenswrapper[4874]: I0126 00:30:03.653310 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29489790-xfn28" Jan 26 00:30:05 crc kubenswrapper[4874]: I0126 00:30:05.648385 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-scjzz_8bf4a267-b4eb-4717-9f5f-6a7b01536bb8/cert-manager-controller/0.log" Jan 26 00:30:05 crc kubenswrapper[4874]: I0126 00:30:05.803720 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-8ddhb_68e61145-f6f9-4493-9d2b-73bda9d3cfad/cert-manager-cainjector/0.log" Jan 26 00:30:05 crc kubenswrapper[4874]: I0126 00:30:05.873604 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-grbvb_595eb0b5-4780-4c76-a7a9-a453173478f0/cert-manager-webhook/0.log" Jan 26 00:30:21 crc kubenswrapper[4874]: I0126 00:30:21.742322 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-nn5n9_e95db833-3b9b-48a8-8e46-6bc5701b400e/prometheus-operator/0.log" Jan 26 00:30:21 crc kubenswrapper[4874]: I0126 00:30:21.910553 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc_524662a5-1e5c-490a-b16c-e3c281242e8c/prometheus-operator-admission-webhook/0.log" Jan 26 00:30:21 crc kubenswrapper[4874]: I0126 00:30:21.971628 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2_70482ca5-c3cf-44cc-bd48-b0bac6edf082/prometheus-operator-admission-webhook/0.log" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.124875 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-rpfld_771abf14-9afb-4281-b842-047d2cc449d4/operator/0.log" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.172421 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-h2z2f_2ce3769d-8607-45e8-afd2-84c971d29cb4/perses-operator/0.log" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.444588 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.445210 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.445287 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.446019 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.446091 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634" gracePeriod=600 Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.854178 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634" exitCode=0 Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.854721 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634"} Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.854761 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerStarted","Data":"ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"} Jan 26 00:30:22 crc kubenswrapper[4874]: I0126 00:30:22.854787 4874 scope.go:117] "RemoveContainer" containerID="9a7abc247aabf047328ecc0c0d044083ed2977de64529411f4495e1c98299431" Jan 26 00:30:36 crc kubenswrapper[4874]: I0126 00:30:36.769258 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/util/0.log" Jan 26 00:30:36 crc kubenswrapper[4874]: I0126 00:30:36.904269 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/util/0.log" Jan 26 
00:30:36 crc kubenswrapper[4874]: I0126 00:30:36.966007 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/pull/0.log" Jan 26 00:30:36 crc kubenswrapper[4874]: I0126 00:30:36.971228 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/pull/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.121333 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/util/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.177609 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/extract/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.199880 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5ckgt_320748b6-5b88-4e31-b9e6-56263e13c488/pull/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.324545 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/util/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.486041 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/util/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.494278 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/pull/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.513599 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/pull/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.716014 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/util/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.717209 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/pull/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.728766 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fdwmqs_1a4e3630-d9f6-442f-9afb-4dff66861ece/extract/0.log" Jan 26 00:30:37 crc kubenswrapper[4874]: I0126 00:30:37.899414 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/util/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.163343 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/pull/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.187287 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/pull/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.194736 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/util/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.433401 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/pull/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.433774 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/util/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.439165 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esb4qq_a29bf9c8-7825-4a6e-9ed4-84c1da217e0f/extract/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.634744 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/util/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.800600 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/util/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.814060 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/pull/0.log" Jan 26 00:30:38 crc kubenswrapper[4874]: I0126 00:30:38.822849 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/pull/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.007169 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/extract/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.034166 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/util/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.035230 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cnllg_e605a81e-2edf-416f-a26d-3967c0d4e15e/pull/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.197498 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-utilities/0.log" Jan 26 
00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.436131 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-content/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.440786 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-utilities/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.456731 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-content/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.700249 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-utilities/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.706648 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/extract-content/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.927369 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-utilities/0.log" Jan 26 00:30:39 crc kubenswrapper[4874]: I0126 00:30:39.934928 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g84nj_eec2d818-be5e-4803-9725-bb1565d2ba99/registry-server/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.170419 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-content/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.172799 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-content/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.181523 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-utilities/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.609716 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-content/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.700044 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/extract-utilities/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.758826 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r9jch_5432e358-16b1-407d-9b0a-e07fa48e6a0d/registry-server/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.931475 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-utilities/0.log" Jan 26 00:30:40 crc kubenswrapper[4874]: I0126 00:30:40.968721 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qbtnf_5724916b-98d3-4a48-8da3-b81427657599/marketplace-operator/0.log" Jan 26 
00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.125916 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-utilities/0.log" Jan 26 00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.134896 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-content/0.log" Jan 26 00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.153903 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-content/0.log" Jan 26 00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.357617 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-utilities/0.log" Jan 26 00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.394598 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/extract-content/0.log" Jan 26 00:30:41 crc kubenswrapper[4874]: I0126 00:30:41.536737 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbqx6_4463e26e-6510-4fe5-8e96-c3035234bd88/registry-server/0.log" Jan 26 00:30:56 crc kubenswrapper[4874]: I0126 00:30:56.122419 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-nn5n9_e95db833-3b9b-48a8-8e46-6bc5701b400e/prometheus-operator/0.log" Jan 26 00:30:56 crc kubenswrapper[4874]: I0126 00:30:56.302217 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6cbb89c5b-ck6jc_524662a5-1e5c-490a-b16c-e3c281242e8c/prometheus-operator-admission-webhook/0.log" Jan 26 00:30:56 crc kubenswrapper[4874]: I0126 00:30:56.360528 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6cbb89c5b-jx9h2_70482ca5-c3cf-44cc-bd48-b0bac6edf082/prometheus-operator-admission-webhook/0.log" Jan 26 00:30:56 crc kubenswrapper[4874]: I0126 00:30:56.486364 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-h2z2f_2ce3769d-8607-45e8-afd2-84c971d29cb4/perses-operator/0.log" Jan 26 00:30:56 crc kubenswrapper[4874]: I0126 00:30:56.489112 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-rpfld_771abf14-9afb-4281-b842-047d2cc449d4/operator/0.log" Jan 26 00:32:01 crc kubenswrapper[4874]: I0126 00:32:01.447200 4874 generic.go:334] "Generic (PLEG): container finished" podID="9f2deefa-15cf-4b89-8774-537359264adf" containerID="a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9" exitCode=0 Jan 26 00:32:01 crc kubenswrapper[4874]: I0126 00:32:01.447337 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mvxb/must-gather-phs8l" event={"ID":"9f2deefa-15cf-4b89-8774-537359264adf","Type":"ContainerDied","Data":"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9"} Jan 26 00:32:01 crc kubenswrapper[4874]: I0126 00:32:01.448267 4874 scope.go:117] "RemoveContainer" containerID="a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9" Jan 26 00:32:01 crc kubenswrapper[4874]: I0126 00:32:01.952529 4874 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-5mvxb_must-gather-phs8l_9f2deefa-15cf-4b89-8774-537359264adf/gather/0.log" Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.475461 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5mvxb/must-gather-phs8l"] Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.476565 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5mvxb/must-gather-phs8l" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="copy" containerID="cri-o://5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d" gracePeriod=2 Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.481216 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5mvxb/must-gather-phs8l"] Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.926115 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mvxb_must-gather-phs8l_9f2deefa-15cf-4b89-8774-537359264adf/copy/0.log" Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.926743 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.953958 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output\") pod \"9f2deefa-15cf-4b89-8774-537359264adf\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.954424 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4rp8\" (UniqueName: \"kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8\") pod \"9f2deefa-15cf-4b89-8774-537359264adf\" (UID: \"9f2deefa-15cf-4b89-8774-537359264adf\") " Jan 26 00:32:08 crc kubenswrapper[4874]: I0126 00:32:08.960291 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8" (OuterVolumeSpecName: "kube-api-access-g4rp8") pod "9f2deefa-15cf-4b89-8774-537359264adf" (UID: "9f2deefa-15cf-4b89-8774-537359264adf"). InnerVolumeSpecName "kube-api-access-g4rp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.025455 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9f2deefa-15cf-4b89-8774-537359264adf" (UID: "9f2deefa-15cf-4b89-8774-537359264adf"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.057435 4874 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9f2deefa-15cf-4b89-8774-537359264adf-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.057466 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4rp8\" (UniqueName: \"kubernetes.io/projected/9f2deefa-15cf-4b89-8774-537359264adf-kube-api-access-g4rp8\") on node \"crc\" DevicePath \"\"" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.532541 4874 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mvxb_must-gather-phs8l_9f2deefa-15cf-4b89-8774-537359264adf/copy/0.log" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.534560 4874 generic.go:334] "Generic (PLEG): container finished" podID="9f2deefa-15cf-4b89-8774-537359264adf" containerID="5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d" exitCode=143 Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.534661 4874 scope.go:117] "RemoveContainer" containerID="5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.534849 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mvxb/must-gather-phs8l" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.572871 4874 scope.go:117] "RemoveContainer" containerID="a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.629008 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2deefa-15cf-4b89-8774-537359264adf" path="/var/lib/kubelet/pods/9f2deefa-15cf-4b89-8774-537359264adf/volumes" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.633272 4874 scope.go:117] "RemoveContainer" containerID="5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d" Jan 26 00:32:09 crc kubenswrapper[4874]: E0126 00:32:09.633725 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d\": container with ID starting with 5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d not found: ID does not exist" containerID="5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.633774 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d"} err="failed to get container status \"5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d\": rpc error: code = NotFound desc = could not find container \"5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d\": container with ID starting with 5d06845d22ee36cba5d0a86fdd1b8c680ce4232a5e998cd0e027ab61ada5143d not found: ID does not exist" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.633798 4874 scope.go:117] "RemoveContainer" containerID="a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9" Jan 26 00:32:09 crc kubenswrapper[4874]: E0126 00:32:09.634077 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9\": container with ID starting with a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9 not found: ID does not exist" containerID="a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9" Jan 26 00:32:09 crc kubenswrapper[4874]: I0126 00:32:09.634122 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9"} err="failed to get container status \"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9\": rpc error: code = NotFound desc = could not find container \"a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9\": container with ID starting with a92404d237d5585e2ca522d41d1fa50a27b10aaa4a528262cd80f8d3640c28a9 not found: ID does not exist" Jan 26 00:32:22 crc kubenswrapper[4874]: I0126 00:32:22.444740 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:32:22 crc kubenswrapper[4874]: I0126 00:32:22.446405 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:32:52 crc kubenswrapper[4874]: I0126 00:32:52.444467 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:32:52 crc kubenswrapper[4874]: I0126 00:32:52.445099 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:33:22 crc kubenswrapper[4874]: I0126 00:33:22.444183 4874 patch_prober.go:28] interesting pod/machine-config-daemon-ffp42 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 26 00:33:22 crc kubenswrapper[4874]: I0126 00:33:22.444916 4874 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 26 00:33:22 crc kubenswrapper[4874]: I0126 00:33:22.445087 4874 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" Jan 26 00:33:22 crc kubenswrapper[4874]: I0126 00:33:22.445807 4874 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"} pod="openshift-machine-config-operator/machine-config-daemon-ffp42" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 26 00:33:22 crc kubenswrapper[4874]: I0126 00:33:22.445872 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerName="machine-config-daemon" containerID="cri-o://ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf" gracePeriod=600 Jan 26 00:33:22 crc kubenswrapper[4874]: E0126 00:33:22.585422 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" Jan 26 00:33:23 crc kubenswrapper[4874]: I0126 00:33:23.129946 4874 generic.go:334] "Generic (PLEG): container finished" podID="b1fd285d-2d8d-4a57-8afa-450025153e32" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf" exitCode=0 Jan 26 00:33:23 crc kubenswrapper[4874]: I0126 00:33:23.130415 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" event={"ID":"b1fd285d-2d8d-4a57-8afa-450025153e32","Type":"ContainerDied","Data":"ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"} Jan 26 00:33:23 crc kubenswrapper[4874]: I0126 00:33:23.130602 4874 scope.go:117] "RemoveContainer" containerID="b2a635df060307f2a66674b706eebc9ffc65bffa75b2c0f495d13ff906c87634" Jan 26 00:33:23 crc kubenswrapper[4874]: I0126 00:33:23.131089 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf" Jan 26 00:33:23 crc kubenswrapper[4874]: E0126 00:33:23.131351 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.121505 4874 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hc6d8"] Jan 26 00:33:26 crc kubenswrapper[4874]: E0126 00:33:26.122163 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="copy" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122179 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="copy" Jan 26 00:33:26 crc kubenswrapper[4874]: E0126 00:33:26.122196 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b8e8be-f61c-4aad-b262-407526e875b2" containerName="collect-profiles" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122204 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b8e8be-f61c-4aad-b262-407526e875b2" containerName="collect-profiles" Jan 26 00:33:26 
crc kubenswrapper[4874]: E0126 00:33:26.122233 4874 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="gather" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122242 4874 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="gather" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122390 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="copy" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122409 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b8e8be-f61c-4aad-b262-407526e875b2" containerName="collect-profiles" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.122428 4874 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2deefa-15cf-4b89-8774-537359264adf" containerName="gather" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.123449 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.138294 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hc6d8"] Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.241011 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.241270 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.241397 4874 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld87q\" (UniqueName: \"kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.342357 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.343034 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.343174 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld87q\" (UniqueName: 
\"kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.343261 4874 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.343535 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.362726 4874 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld87q\" (UniqueName: \"kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q\") pod \"community-operators-hc6d8\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") " pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.442515 4874 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:26 crc kubenswrapper[4874]: I0126 00:33:26.996921 4874 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hc6d8"] Jan 26 00:33:27 crc kubenswrapper[4874]: W0126 00:33:27.003092 4874 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd24a9a9_bd78_4bd7_beba_3f119b9bf897.slice/crio-1e4f6cb225ef0f4f9377153b2d4d89de94cd2e615d8318663cfdb4250b30d8fe WatchSource:0}: Error finding container 1e4f6cb225ef0f4f9377153b2d4d89de94cd2e615d8318663cfdb4250b30d8fe: Status 404 returned error can't find the container with id 1e4f6cb225ef0f4f9377153b2d4d89de94cd2e615d8318663cfdb4250b30d8fe Jan 26 00:33:27 crc kubenswrapper[4874]: I0126 00:33:27.164047 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerStarted","Data":"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"} Jan 26 00:33:27 crc kubenswrapper[4874]: I0126 00:33:27.164134 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerStarted","Data":"1e4f6cb225ef0f4f9377153b2d4d89de94cd2e615d8318663cfdb4250b30d8fe"} Jan 26 00:33:28 crc kubenswrapper[4874]: I0126 00:33:28.176442 4874 generic.go:334] "Generic (PLEG): container finished" podID="fd24a9a9-bd78-4bd7-beba-3f119b9bf897" containerID="8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0" exitCode=0 Jan 26 00:33:28 crc kubenswrapper[4874]: I0126 00:33:28.176656 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerDied","Data":"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"} Jan 26 00:33:30 crc 
kubenswrapper[4874]: I0126 00:33:30.222217 4874 generic.go:334] "Generic (PLEG): container finished" podID="fd24a9a9-bd78-4bd7-beba-3f119b9bf897" containerID="04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34" exitCode=0 Jan 26 00:33:30 crc kubenswrapper[4874]: I0126 00:33:30.222287 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerDied","Data":"04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34"} Jan 26 00:33:31 crc kubenswrapper[4874]: I0126 00:33:31.230153 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerStarted","Data":"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"} Jan 26 00:33:31 crc kubenswrapper[4874]: I0126 00:33:31.247360 4874 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hc6d8" podStartSLOduration=2.659879899 podStartE2EDuration="5.247338576s" podCreationTimestamp="2026-01-26 00:33:26 +0000 UTC" firstStartedPulling="2026-01-26 00:33:28.178726157 +0000 UTC m=+1582.912832694" lastFinishedPulling="2026-01-26 00:33:30.766184834 +0000 UTC m=+1585.500291371" observedRunningTime="2026-01-26 00:33:31.246266286 +0000 UTC m=+1585.980372813" watchObservedRunningTime="2026-01-26 00:33:31.247338576 +0000 UTC m=+1585.981445113" Jan 26 00:33:35 crc kubenswrapper[4874]: I0126 00:33:35.625410 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf" Jan 26 00:33:35 crc kubenswrapper[4874]: E0126 00:33:35.626089 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32" Jan 26 00:33:36 crc kubenswrapper[4874]: I0126 00:33:36.443944 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:36 crc kubenswrapper[4874]: I0126 00:33:36.444019 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:36 crc kubenswrapper[4874]: I0126 00:33:36.518569 4874 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:37 crc kubenswrapper[4874]: I0126 00:33:37.322997 4874 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hc6d8" Jan 26 00:33:37 crc kubenswrapper[4874]: I0126 00:33:37.375199 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hc6d8"] Jan 26 00:33:39 crc kubenswrapper[4874]: I0126 00:33:39.295897 4874 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hc6d8" podUID="fd24a9a9-bd78-4bd7-beba-3f119b9bf897" containerName="registry-server" containerID="cri-o://1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c" gracePeriod=2 Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 
00:33:40.262048 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hc6d8"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.265786 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content\") pod \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") "
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.265890 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities\") pod \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") "
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.265924 4874 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld87q\" (UniqueName: \"kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q\") pod \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\" (UID: \"fd24a9a9-bd78-4bd7-beba-3f119b9bf897\") "
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.268520 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities" (OuterVolumeSpecName: "utilities") pod "fd24a9a9-bd78-4bd7-beba-3f119b9bf897" (UID: "fd24a9a9-bd78-4bd7-beba-3f119b9bf897"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.274148 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q" (OuterVolumeSpecName: "kube-api-access-ld87q") pod "fd24a9a9-bd78-4bd7-beba-3f119b9bf897" (UID: "fd24a9a9-bd78-4bd7-beba-3f119b9bf897"). InnerVolumeSpecName "kube-api-access-ld87q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.306302 4874 generic.go:334] "Generic (PLEG): container finished" podID="fd24a9a9-bd78-4bd7-beba-3f119b9bf897" containerID="1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c" exitCode=0
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.306353 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerDied","Data":"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"}
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.306384 4874 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hc6d8" event={"ID":"fd24a9a9-bd78-4bd7-beba-3f119b9bf897","Type":"ContainerDied","Data":"1e4f6cb225ef0f4f9377153b2d4d89de94cd2e615d8318663cfdb4250b30d8fe"}
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.306401 4874 scope.go:117] "RemoveContainer" containerID="1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.306415 4874 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hc6d8"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.331039 4874 scope.go:117] "RemoveContainer" containerID="04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.348173 4874 scope.go:117] "RemoveContainer" containerID="8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.367175 4874 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-utilities\") on node \"crc\" DevicePath \"\""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.367214 4874 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld87q\" (UniqueName: \"kubernetes.io/projected/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-kube-api-access-ld87q\") on node \"crc\" DevicePath \"\""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.370385 4874 scope.go:117] "RemoveContainer" containerID="1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"
Jan 26 00:33:40 crc kubenswrapper[4874]: E0126 00:33:40.370880 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c\": container with ID starting with 1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c not found: ID does not exist" containerID="1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.370924 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c"} err="failed to get container status \"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c\": rpc error: code = NotFound desc = could not find container \"1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c\": container with ID starting with 1d43fd8fd05eacc6396b7efc5cb94301f66d3cf38d57744565490c9d5759401c not found: ID does not exist"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.370970 4874 scope.go:117] "RemoveContainer" containerID="04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34"
Jan 26 00:33:40 crc kubenswrapper[4874]: E0126 00:33:40.371257 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34\": container with ID starting with 04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34 not found: ID does not exist" containerID="04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.371284 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34"} err="failed to get container status \"04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34\": rpc error: code = NotFound desc = could not find container \"04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34\": container with ID starting with 04ec18764f48539f5248a91d913f0da0faadf92c0705552610e9701970db9e34 not found: ID does not exist"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.371309 4874 scope.go:117] "RemoveContainer" containerID="8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"
Jan 26 00:33:40 crc kubenswrapper[4874]: E0126 00:33:40.371659 4874 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0\": container with ID starting with 8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0 not found: ID does not exist" containerID="8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.371709 4874 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0"} err="failed to get container status \"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0\": rpc error: code = NotFound desc = could not find container \"8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0\": container with ID starting with 8986c603a7603afb4d904d4e0bbcdf188658eb98fdf6abd9c290df0786bccbf0 not found: ID does not exist"
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.377093 4874 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd24a9a9-bd78-4bd7-beba-3f119b9bf897" (UID: "fd24a9a9-bd78-4bd7-beba-3f119b9bf897"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.468328 4874 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd24a9a9-bd78-4bd7-beba-3f119b9bf897-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.639301 4874 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hc6d8"]
Jan 26 00:33:40 crc kubenswrapper[4874]: I0126 00:33:40.647160 4874 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hc6d8"]
Jan 26 00:33:41 crc kubenswrapper[4874]: I0126 00:33:41.631086 4874 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd24a9a9-bd78-4bd7-beba-3f119b9bf897" path="/var/lib/kubelet/pods/fd24a9a9-bd78-4bd7-beba-3f119b9bf897/volumes"
Jan 26 00:33:47 crc kubenswrapper[4874]: I0126 00:33:47.621126 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:33:47 crc kubenswrapper[4874]: E0126 00:33:47.622208 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
Jan 26 00:34:02 crc kubenswrapper[4874]: I0126 00:34:02.621004 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:34:02 crc kubenswrapper[4874]: E0126 00:34:02.624272 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
Jan 26 00:34:14 crc kubenswrapper[4874]: I0126 00:34:14.620697 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:34:14 crc kubenswrapper[4874]: E0126 00:34:14.621459 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
Jan 26 00:34:25 crc kubenswrapper[4874]: I0126 00:34:25.626374 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:34:25 crc kubenswrapper[4874]: E0126 00:34:25.627512 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
Jan 26 00:34:40 crc kubenswrapper[4874]: I0126 00:34:40.621573 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:34:40 crc kubenswrapper[4874]: E0126 00:34:40.622719 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
Jan 26 00:34:51 crc kubenswrapper[4874]: I0126 00:34:51.620834 4874 scope.go:117] "RemoveContainer" containerID="ff244b4cc4be5444f8c25d5e647e81127efae7368372072d69bd1d493b01dddf"
Jan 26 00:34:51 crc kubenswrapper[4874]: E0126 00:34:51.621459 4874 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ffp42_openshift-machine-config-operator(b1fd285d-2d8d-4a57-8afa-450025153e32)\"" pod="openshift-machine-config-operator/machine-config-daemon-ffp42" podUID="b1fd285d-2d8d-4a57-8afa-450025153e32"
var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz
var/home/core/zuul-output/logs/crc-cloud/
var/home/core/zuul-output/artifacts/
var/home/core/zuul-output/docs/